And yes, we've been using Jira for ages. We tried a bunch of other things and hated them. I've never liked Gmail's user interface. But that might just be me being old and crotchety.
The problem with software as a service is that you have to send out floppy disks with the latest version of the client, and then deal with people having different kinds of modems with different init strings, and deal with modem banks, and it can be slow...
So yeah, once the network infrastructure was in place, SaaS started to make sense. I like the idea ... but I'm starting to worry that we're moving towards a world where you either work for Amazon or you don't work in IT at all.
Here's an idea: folder pane, message list pane, reading pane. Why can't Google understand that some people *like* it the traditional way?
I also don't like the message list as a vertical pane in between the folder pane and reading pane. I know it's designed for a wide screen, but even when you're given a choice (like with Outlook Web Access) the message list entries take up so much vertical space that it doesn't look right.
This is silly. It's something that *everyone* got right at some point, and was a universally accepted and understood user interface, and everyone's messing it up now.
No one is immune. The US Postal Service has introduced its new CloudCloud(tm) service, which delivers your postal mail in Teh Cloud.
They don't know how or if it works, but they're doing it anyway. Because the cloud.
Have we been discussing whether a tablet/phone/android/ios is a real computer or not in this room?
Anyway, there's a guy who writes about how an iPad has been his main computer for about a year now:
You just have to take into account that his main job seems to be "writing a blawg" and that he uses an iPad Pro...
Next month I'll try whether an iPad mini 2 can get me through the day for most of my tasks, with an added keyboard, maybe a Zagg folio or something else. The only thing I can't imagine how to solve properly is interacting with LANs at client sites. While iOS is capable of using a LAN (thanks to Apple TVs), it is a crutch. And using an AirPort Express might be uncomfy.
There was a discussion about "real computers" but it was skewed by a confirmation bias attempting to place iOS devices in that category and exclude Android from it. Unfortunately the reality distortion field did not die along with The Steve Jobs.
A real computer is one powerful enough to serve the purpose it was designed for. The question of whether an "access device" counts as a computer or just as a terminal is extremely blurry because Netscape not only won, but totally DOMINATED the browser war. Too bad they died from their battle injuries.
But just about everything is server-side now.
Eh? IIRC, the "apple huggers" (me and ragnar(?)) put tablets in the "not a computer" corner, including ios ones. We might have refused to give android any credit because it is utter shit (and not a "real" linux), but phones and tablets weren't computers because of missing full filesystem access, regardless of whether they were android or ios. That is at least how I remember it.
Tablets vs computers is like buying Eastern European cars (tablets) vs Western European cars (computers):
If you need to get one, buy a Skoda (ios), not a Dacia (android).
If we're going that low, even a calculator these days is a "computer".
As for living on a tablet - I did it recently for a week.
I quickly found scenarios where I missed a real computer. Photo edits, emailed documents and remote access to other systems were all sorely wanting.
But I survived, on an iPad Pro with keyboard and pencil.
That being said, I think that I would have preferred a Microsoft Surface for a lightweight travel system where I still wanted real computer functionality.
> There was a discussion about "real computers" but it was skewed by a
> confirmation bias attempting to place iOS devices in that category and
> exclude Android from it.
Yeah, skewed... by a trollin' ;)
There's no point in trying to define a "real computer" because it's an ambiguous term. As I pointed out above, a device becomes a "real computer" when it has enough power to perform the workload *you* need it for.
Is it a "real computer" when it can go online and access a larger host system?
Is it a "real computer" when it has a web browser and javascript stack?
Is it a "real computer" when arbitrary software can be loaded onto it?
Is it a "real computer" when it can crunch exabyte-size data sets?
A phone or tablet meets the first three criteria, but fails the fourth. But so does a MacBook.
By that standard, a Macintosh is not a real computer because you need a Lisa to develop for it.
I hear they may have worked around that limitation but I haven't seen it yet myself.
So why did they focus on the Mac, which was sort of an experimental side project at the time? There is some speculation that it came down to internal politics, with The Steve Jobs throwing his weight around during his second tenure at the company.