2018/08/01

The pendulum

While doing research on developing applications for both the Windows Desktop and the Windows Phone environment, the question inevitably comes up: shouldn't you "just" develop a web application?

When asking whether applications should be hosted on "a server" in "a datacenter", or whether everything should move to "the cloud", it's important to keep in mind that nothing's new under the sun.

Let me tell you about a pendulum.



The pendulum is not a myth. It is a fact. From the dawn of computing, there has been one. To be precise, there have been many. The pendulum I'm talking about swings between doing things locally (on your own computer) and doing them on a server (your "local" computer being only a terminal). Sometimes it swings to one side, sometimes to the other. Look at this:
  • During the 60's, every computing action took place on a local computer. Those were the days, my friend :-).
  • During the 70's, mainframe-terminal combinations became the way to go. An enormous leap, because from a regular workplace, you could access those large machines!
  • During the 80's, PCs became powerful enough to do everything themselves.
  • From the mid-90's until the '00s, the web bubble took off. Everything was on the web, or should be moved to the web. Google was booming.
  • The '00s: mobile devices became more powerful (Windows Mobile and Palm - the iPhone only appeared in 2007). Although the need for synchronization rose, the actual content was still stored locally on your phone and PC (even with the first iPhone in 2007, this was the case). Remember TomTom, running on top of Pocket PC with all its maps on board?
  • The '10s (starting around 2008): although mobile phones became even more powerful, the increasing number of devices per user and the availability of affordable broadband wireless networking led to a shift back: smartphones started to interact directly with their servers, storing and downloading only the data needed to fulfill the user's need. The rise of the cloud moved even more applications "back to the server".
The pendulum swings. At the extremes, boundaries are pushed. When it swings back, technologies are abandoned - but the good parts remain.

    2014/11/20

    If you ever need to determine daylight saving time in Europe, here's a T-SQL script that does it.

    Or, as the Dutch call it, 'zomertijd' and 'wintertijd' (summer time and winter time). The temp table below holds a few UTC timestamps around the switch moments, together with the expected CET/CEST result, so the calculated column can be checked against it.

    CREATE TABLE #daylight_saving (
      date_utc DATETIME,
      is_dst BIT,
      date_cet DATETIME
    )

    INSERT INTO #daylight_saving (date_utc, is_dst, date_cet)
    VALUES
    ('2014-11-20 14:36', 0, '2014-11-20 15:36')
    , ('2014-11-20 23:59', 0, '2014-11-21 00:59')
    , ('2014-03-30 00:00', 0, '2014-03-30 01:00')
    , ('2014-03-30 00:59:59', 0, '2014-03-30 01:59:59')
    , ('2014-03-30 01:00:00', 1, '2014-03-30 03:00:00')
    , ('2014-10-26 00:59:59', 1, '2014-10-26 02:59:59')
    , ('2014-10-26 01:00:00', 0, '2014-10-26 02:00:00')

    SELECT
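      -- EU rule: DST (CEST, UTC+2) runs from the last Sunday of March 01:00 UTC
      -- until the last Sunday of October 01:00 UTC; outside that window CET is UTC+1.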
      CASE
        WHEN date_utc <
          DATEADD(
            HOUR
            , 1
            , DATEADD(
              DAY
              , -DATEDIFF(
                DAY
                , 6
                , CAST(YEAR(date_utc) AS char(4)) + '0331 00:00:00' -- last day of March
              ) % 7
              , CAST(YEAR(date_utc) AS char(4)) + '0331 00:00:00' -- last day of March
            ) -- last Sunday of March
          ) -- last Sunday of March 01:00 UTC = DST start
        OR date_utc >=
          DATEADD(
            HOUR
            , 1
            , DATEADD(
              DAY
              , -DATEDIFF(
                DAY
                , 6
                , CAST(YEAR(date_utc) AS char(4)) + '1031 00:00:00' -- last day of October
              ) % 7
              , CAST(YEAR(date_utc) AS char(4)) + '1031 00:00:00' -- last day of October
            ) -- last Sunday of October
          ) -- last Sunday of October 01:00 UTC = DST end
        THEN DATEADD (HOUR, 1, date_utc)
        ELSE DATEADD(HOUR, 2, date_utc)
        END date_cet_calculated
      , date_cet
      , date_utc
      , is_dst
    FROM #daylight_saving
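
    If you need this conversion in more than one place, the same CASE logic could be wrapped in a scalar function. Here's a minimal sketch of that idea - the name dbo.fn_utc_to_cet is just an example, not part of the script above:

    -- Sketch: UTC to CET/CEST for European DST, using the same
    -- "last Sunday of the month" trick as the query above.
    CREATE FUNCTION dbo.fn_utc_to_cet (@date_utc DATETIME)
    RETURNS DATETIME
    AS
    BEGIN
      DECLARE @dst_start DATETIME, @dst_end DATETIME

      -- last Sunday of March, 01:00 UTC (day 6 = 1900-01-07, a Sunday)
      SET @dst_start = DATEADD(HOUR, 1,
        DATEADD(DAY, -DATEDIFF(DAY, 6, CAST(YEAR(@date_utc) AS char(4)) + '0331') % 7,
          CAST(YEAR(@date_utc) AS char(4)) + '0331'))

      -- last Sunday of October, 01:00 UTC
      SET @dst_end = DATEADD(HOUR, 1,
        DATEADD(DAY, -DATEDIFF(DAY, 6, CAST(YEAR(@date_utc) AS char(4)) + '1031') % 7,
          CAST(YEAR(@date_utc) AS char(4)) + '1031'))

      RETURN CASE
        WHEN @date_utc >= @dst_start AND @date_utc < @dst_end
          THEN DATEADD(HOUR, 2, @date_utc) -- CEST (summer time)
        ELSE DATEADD(HOUR, 1, @date_utc)   -- CET (winter time)
      END
    END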


    2011/02/28

    Even older than Raskin

    As promised, a post about Fitts's Law - the second post about GUI paradigms.

    Paul Fitts was an Ohio State University psychologist who developed a model of human movement (Wikipedia). It turned out to be a very accurate and usable model, predicting how fast we (humans) can accurately aim at a target, depending on the size of and the distance to the target. This model is called Fitts's Law.

    The reason Fitts's Law became so widely used in GUI design is that it applies well to pointing devices such as the mouse - it explains, for example, why it's so easy to hit targets on the edge of your screen (and even easier to hit targets in the corners of your screen). For details, see the (somewhat dated) article on AskTog: First Principles of Interaction Design. Or, for a straightforward explanation of the impact of Fitts's Law, see Particletree's Visualizing Fitts's Law.
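
    In its common "Shannon" formulation, Fitts's Law predicts the average movement time MT from the distance D to the target and the target width W, with a and b as constants fitted per pointing device:

        MT = a + b * log2(1 + D / W)

    Larger and closer targets give a lower MT. The screen edge stops the mouse cursor, which effectively makes edge and corner targets infinitely deep (a huge W) - that's the Fitts's-Law reason they're so easy to hit.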

    The reason I post about it is that we're still discovering new ways to apply a well-known law from 1954:
    - In 1954, Fitts's Law was "discovered".
    - In 1984, Fitts's Law was used "for real" for the first time (I think) in a graphical user interface (GUI) with the Apple Macintosh, whose menu bar always sits at the top of the screen and is therefore very easy to target.
    - In 2003, Apple introduced Exposé with Active Screen Corners, making effective use of these easy-to-hit areas to trigger functions on your computer.
    - In 2009, Windows 7 was introduced, using the screen edges for arranging your windows quickly - just smash a window against the top to maximise it, against the left to fill half of your screen, et cetera.

    That means that, more than half a century after Fitts developed his model of human movement, we're still finding new ways to implement it the right way. That's a long time - about 10 generations. And it makes me look forward to the ways we'll find over the next 30 years to operate touch devices :-)...

    2011/02/25

    About User Interfaces

    Last week I did some research for a tiny project I'm doing at the University of Amsterdam.

    This project is about the development of User Interfaces through the years. For example, back in the olden days your monitor was... well... just a speaker! Imagine a processor ticking at 8 kHz, and the checksum of the "input bits" being redirected to a speaker. It turns out you get a strange kind of melody. If you were to watch a stream of dots passing by at that speed, you wouldn't notice an unevenness in the pattern. But when one bit is missing, the pitch changes dramatically - so the monitoring was done by ear. Dr. Gerard Alberts has posted some videos of these early computers, but I can't find them right now (so I'll post 'em later).

    Anyhow, while looking for the history of User Interfaces I stumbled across the book "The Humane Interface" by Jef Raskin. Ordered it at the university library, and read it last week. It's a really good read, and it got me thinking about how User Interfaces are supposed to work. Raskin explores User Interfaces from the viewpoint of humans and human shortcomings (very interesting!). Some human shortcomings (such as the time we need to switch from one task to another) can be exploited by UIs. Read the wiki page, and if you're not convinced, read the book :-). Some of the points Raskin makes are (the points are from Wikipedia, the explanation is mine):
    • Modelessness. Computers shouldn't be in different "modes" where they react differently - for example, using Ctrl-C for copying text in one context and Ctrl-C for terminating a program in another means the same action triggers different behavior.
    • Monotony of design. System / UI designers should figure out the best way to do something, and only implement that one - it confuses users when dozens of options are offered to do one task.
    • Every action must be undoable. Programmers have known this for a long time and call it version control ;-). The consequence, however, is that save buttons should become obsolete. A computer should never discard your work. This also diminishes the need for dialog boxes.
    • Elimination of warning screens. People tend to automate tasks, almost making it a reflex to click away a dialog box.
    • Universal use of text. Icons are okay, but they should be accompanied by text so it's obvious what they do.
    • When you stop using your computer, you shouldn't have to wait for it to boot the next time; you should continue with the task where you left off.
    Especially the undo-everything idea struck me. Of course, there will be operations where it's necessary (because of legal restrictions) to save things explicitly. But still, a computer should simply never, ever discard my work! When the lights go out and I turn my computer back on, my work should still be there.

    That was in 2000. Eleven years ago - nearly two computer lifespans (remember, in 2000 you were still playing around with Windows ME if you were on Microsoft. The first stable Microsoft consumer OS - Windows XP - wasn't even finished at that moment!).

    What you should know, however, is that Jef Raskin designed the original Macintosh interface. Not all of his ideas were implemented, though. Today the new specifications of Mac OS X 10.7 (Lion) were made public. And guess what? Three in a row:

    • Auto Save - "Say good-bye to manual saving. Auto Save in Mac OS X Lion automatically saves your work - while you work - so you don’t have to."
    • Versions - Versions records the evolution of a document as you create it. Mac OS X Lion automatically creates a version of the document each time you open it and every hour while you’re working on it. If you need to revert to an older version or retrieve part of a document, Versions shows you the current document next to a cascade of previous versions (...) so you can see how your work looked at any given time. 
    • Resume - If you’ve ever restarted your Mac, you know what’s involved. First you save your work, then close all your apps, then spend valuable time setting everything up again. With Resume, that time-consuming process is a thing of the past. Resume lets you restart your Mac — after a software update, for example — and return to what you were doing. With all your apps back in the exact places you left them. In fact, whenever you quit and relaunch an app, Resume opens it precisely the way you left it. So you never have to start from scratch again.
    Amazing - ideas from eleven years ago (and technically no problem for over a decade) are still inspiring today's computer makers. Of course, Apple will claim this as their invention. We know it isn't, but still, they're the first to include these features in a consumer-grade OS - which makes it (for me at least) worth considering when buying my next computer.

    Next time I'll dig into some more UI stuff with Fitts's Law (even older than Raskin).

    2011/02/24

    Twin-platform-development on Silverlight for Desktop and WP7

    It's been a while since my last post. I'm hoping to post more often in the coming weeks, and to eventually become a really active blogger. Just kidding :-).

    From now on, I'll use this blog to post about my research project (my bachelor's final project). The project is about developing an application for different devices, with the focus on the Windows Desktop and the Windows Phone. For me, as a Linux user, this is all pretty new, but I'm curious about the stuff Microsoft has to offer! (And for that sentence I will be put to death by the Open Source Inquisition.)

    At the end of the research, I will produce a whitepaper containing advice, best practices and pitfalls for developing for two different devices. In addition, I will deliver a proof-of-concept application that I will develop myself. My research will be largely based on literature (scientific papers and articles), community content (weblogs, general articles) and documented experience.

    The next couple of posts will be about my explorations of the Microsoft universe; if you're interested in my quest and findings: stay tuned!

    2010/10/12

    Windows Phone 7

    I'm not fully convinced by the Windows Phone 7 ideas, but they have some cool videos that really appeal to me - one called "Really?" and another called "Season of the Witch".

    The cool thing about the concept is that Windows Phone is not another mobile OS that immerses you in its experience. As Steve Ballmer put it: "You get in, out, and back to life." So the videos make fun of everyone, always and everywhere, staring at their mobile phones. And they've got a point there. Maybe we have no need for another "beautiful phone" that uses even more of your time. Not another phone that you just "keep using" because it looks so beautiful and works so well. Maybe it's time for phones that use less of your time, that don't get in the way of communicating with others, but still let you do the things you want to do.

    Or maybe it's time for users who can handle priorities. Maybe it's not the phone that needs to change. Maybe it's the people and their behaviour, which won't be changed by a different type of mobile phone. Even then, the videos are still funny.

    To finish: one quote from the developer videos (announcing the most recent tools available): "Go get 'em. Be inspired, be awesome, stay nerdy" :-).

    Android 2.2 - update

    Ok, so even Android 2.2 isn't a silver bullet ;-).

    After using it for a while, I had to turn off JIT compilation, because it made my phone... well... run slower. The only reason I can think of is a lack of internal memory. Compiled native code is simply larger - and thus uses more memory - than bytecode, and my two-year-old G1 - the first Android handset - doesn't have that much RAM (around 74 MB, compare that to the myTouch and iPhone 4, which have 512 MB).

    Still, it's an amazing system. A state-of-the-art smartphone OS running quite smoothly on quite old hardware. Thumbs up for Cyanogen (and a $10 donation).