post

Foxmarks, Xmarks, LastPass, Xpass, LastX, X%^&% Quick Rant

Warning: I think I’m becoming a curmudgeon – except that title has until now been reserved for somebody else :-)  But I still have doubts about the recent transaction: LastPass acquiring Xmarks.

I really liked Xmarks – when it was Foxmarks.  A simple bookmark synchronization service that would keep your Firefox bookmarks up-to-date no matter where you logged in.  Essential Cloud Computing for the age when we’re no longer enslaved to a single computer.

Then it became Xmarks, started to offer password sync and several other services, including “enhanced” Google Search – i.e. adding a social layer to Google’s algorithm. I opted out of password sync, sticking to the basics.

LastPass, on the other hand, was a solution for the password conundrum – so good that Ben was ready to dismiss his usual concerns.  The transaction probably makes sense for both parties: Xmarks was going down the drain, having experimented with business models and run out of cash, while LastPass picks up millions of users.

So why am I ranting?

(Cross-posted @ CloudAve » Zoli Erdos)

post

Using Picasa on Multiple Computers – The Updated Definitive Guide


My 4-year old how-to guide, Picasa Photo Sync on Multiple Computers, has attracted tens of thousands of viewers, and is still quite popular.  In fact too popular, thanks to Google.  I can’t believe people actually read it today and try to follow the advice therein… it’s an OLD post with outdated information.  I’ve long struggled to find a better solution… and now that I have it … drumroll … but wait, first things first:

What’s the problem with Picasa?

Picasa is my favorite photo management program, and hey, it’s hard to beat free!  Yes, I believe SaaS is the future of computing, and I do keep many photos online (just canceled Flickr Pro in favor of PicasaWeb), but quick-and-dirty manipulation of large image files en masse is still easier and faster on a local PC.  Or on any of the computers I use – if only I could.  It’s hard to believe that Google, an undeniably Web-centric company, would create an application designed to be used by one single user on one single computer – that’s stone-age vision, completely at odds with being a visionary Web company. 

Picasa does not save your edits in the image file itself; rather, it uses a set of system files: picasa.ini files in every photo folder and a bunch of proprietary databases in two hidden system directories.  This is actually a good concept – you can experiment and safely revert back to the original – but trouble starts when you want to move to a new computer, or, God forbid, access your photos from multiple computers: some of the associated changes will come through, others won’t.  You will soon have multiple versions of the databases, and sometimes of the images themselves, and that leads to chaos. 

Early Solutions

The original concept in my previous guide was based on syncing the hidden Picasa databases between all computers involved. It worked for a while… then I started to see corrupted databases, so I abandoned synchronization.  In the meantime wireless home networks became more robust, so instead of redundant chaos, the next best option was maintaining one central Picasa home base and accessing it from the other computers via the network.  This could quite easily be done by mapping the main computer’s drive as a network drive, say P: (for Photos or Picasa), and setting Picasa on all the satellite computers to forget the local Pictures folders and only scan the new P: drive. 

In this setup Picasa still had to index all images it read from the network and recreate a local database on the individual computers, so the solution was quite redundant – but it worked relatively well.  Through a succession of new releases Google moved more information about user edits into the per-folder Picasa.ini files, so the system was able to rebuild the database almost completely.  Cropping and some other information was still missing, so you could never be 100% certain you were looking at identical versions of your images.  The safest way to avoid confusion and different views of the same photos was to make a policy of only editing images on the “main computer” where they were stored, thus reducing all other networked computers to passive viewers. 

There has to be a better solution… one that allows any member of the family (and any user account), using any computer on the network, to share the one and only Picasa database – view and edit all the same, with any changes, tagging or editing immediately saved, no matter which computer is being used.   Yes, there is one – keep on reading :-)  But first, some disclaimers:

  • I’ve tested the solutions below in Windows 7
  • They should work on Vista, too, and I believe there is a logical equivalent under XP, but I’ve never checked it
  • These solutions work for me, but I can not guarantee they will work for you – experiment at your own risk
  • Before making any changes, do back up your Picasa database (both photos and the system data)
  • Even if everything works, there’s no way of knowing whether a future Picasa release will change it all…
  • I’m not a Windows Guru, and make no claims that this is the best or most elegant solution – just one that works for me
  • I cannot provide individual support – you are welcome to comment / contribute below, and may get a response from another reader, but I can not make promises.

Now, we’re ready to rock and roll …

Sharing Picasa Between Multiple User Accounts on the Same Computer

You may only be interested in the multi-computer setup, but please read this chapter anyway, as we will build on the logic outlined here when we expand to a network setup.

Move your photo library to a public location

By default most photos are stored in user-account-specific picture libraries, with a path similar to this in Windows 7 and Vista:

C:\Users\username\Pictures  

You could fiddle around with sharing / security properties to enable other user accounts to access this image folder, but moving your photos to the public folder is a much cleaner solution.  The new destination is:

C:\Users\public\Pictures

Although the easiest way to move folders is from Windows, it’s always better to do it within Picasa, to allow its databases to be updated properly.   If you use nested folders, you’re in luck: you can just right-click on the top-level folder, select “Move Folder”, pick the new destination, and you’re done.  (If you have nested folders but don’t see them in Picasa, change from “Flat View” to “Tree View” in the main View menu.)  If you have a lot of flat folders, this may be a cumbersome process, but it’s one-time only.

This was easy … now close Picasa and let’s get really started :-)  Two reminders before we start:

  • you’ll need to do all this using an account with Admin privileges
  • backup, backup, backup (your photos and system folders / files) – see the example right below
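
Just to illustrate that second point: robocopy, which ships with Vista and Windows 7, is one quick way to grab a copy of both the photos and Picasa’s system folders before you touch anything.  This is only a sketch – E:\PicasaBackup is a placeholder for wherever you want the backup to land, username stands for your own account, and any backup tool you trust will do just as well:

robocopy C:\Users\Public\Pictures E:\PicasaBackup\Pictures /E

robocopy C:\Users\username\AppData\Local\Google\Picasa2 E:\PicasaBackup\Picasa2 /E

robocopy C:\Users\username\AppData\Local\Google\Picasa2Albums E:\PicasaBackup\Picasa2Albums /E

(/E simply tells robocopy to copy all subfolders, empty ones included.)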

Move Picasa’s internal databases to a public folder

The internal Picasa databases are originally in two system folders in Windows 7 / Vista:

C:\Users\username\AppData\Local\Google\Picasa2

C:\Users\username\AppData\Local\Google\Picasa2Albums

You’ll need to create a new home for these two folders, for example this:

C:\Users\Public\PicasaLib

Now move the Picasa2 and Picasa2Albums folders to the newly created PicasaLib folder, so their new locations are:

C:\Users\Public\PicasaLib\Picasa2

C:\Users\Public\PicasaLib\Picasa2Albums
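
If you prefer the command line to dragging folders around in Explorer, the same move can be done from a regular command prompt – this assumes the default locations shown above, with username standing for your own account, and Picasa closed:

mkdir C:\Users\Public\PicasaLib

move C:\Users\username\AppData\Local\Google\Picasa2 C:\Users\Public\PicasaLib\Picasa2

move C:\Users\username\AppData\Local\Google\Picasa2Albums C:\Users\Public\PicasaLib\Picasa2Albums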

Well done. Too bad Picasa is still looking for these databases in the old place…

Trick Picasa into finding the new database location

At this point you should no longer have a Picasa2 and Picasa2Albums folder in your C:\Users\username\AppData\Local\Google\ folder – if you do, you likely copied them to the new destination instead of moving.  If that’s the case, please delete them now – we can’t have real folders with those names here, since we are going to replace them with Symbolic Links that look just like the deleted folders but will actually redirect Picasa  to the new location.

For the next steps, even though you’re logged into a user account with Admin rights, you will need an elevated command prompt. If you’re like me and can’t remember hot-key combinations, here’s how to get it: click the Start menu and type cmd in the search box, but do not hit Enter. Instead, find cmd.exe at the top of the list, right-click on it, then left-click Run as Administrator.

Now you’re in a command box that reminds you of good old DOS.  Navigate to the original AppData folder:

cd  \Users\username\AppData\Local\Google   

Now type these lines exactly as you see them:

mklink /d Picasa2  C:\Users\Public\PicasaLib\Picasa2

mklink /d Picasa2Albums C:\Users\Public\PicasaLib\Picasa2Albums

You have just created two entries that look like the Picasa2 and Picasa2Albums folders but actually point to their newly created location.
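
If you want to double-check your handiwork before moving on, you’re still sitting in the Google folder, so a plain

dir

should list Picasa2 and Picasa2Albums as <SYMLINKD> entries, with the new PicasaLib paths shown in brackets next to them.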

Update and verify Picasa for each user

Open Picasa, go to Tools > Folder Manager and make sure only the new public destination is selected, nothing else – certainly not user specific libraries.

Repeat the above relocation steps for all other users on the same computer, and check their Folder Manager setting in Picasa.

You’re all set!  All users now have shared access to all public photos, and edits, changes, thumbnails, etc. are all maintained in a central database, instantly available to all users.

Warning: I have not tested what happens if multiple users try to update the databases at the same time, but I assume it is not a very good idea.  Best practice is probably to avoid fast user switching altogether – log out of one user before logging into another – but at a minimum, even if you do quick-switch, don’t leave Picasa open in two user accounts at the same time.

Picasa on Multiple Computers

This is what you’ve been waiting for…  we’re actually very close, the logic is surprisingly simple: map the drive that has our Picasa library and databases as a network drive, say P, then apply the tricks we’ve just learned doing the multi-user setup on the same computer, but now the symbolic links will point to the public folders on the P: drive, and voila!

Well, almost…too bad there are a number of quirks that we have to deal with first.  Let’s take them up one by one.

Your network layout

If you have a NAS drive, which is for passive storage only, accessed by several computers on the network, then the above solution will work, since you can map the NAS drive to the same drive letter on all computers.  But if your network is like mine, i.e. there is no NAS and Picasa resides on one of the actively used computers which all the others access, then you run into all sorts of trouble.  Here’s why:  Picasa stores your “watched folders” configuration in a plain text file named watchedfolders.txt in the Picasa2Albums folder.  But we’ve just moved that folder to our shiny new PicasaLib to be shared by all instances of Picasa – which means they cannot have different “watched folders” per instance.

The problem is, the “main” computer will consider Picasa storage to be on its C: drive, while all the others have to refer to it by another drive letter, since C: is reserved for their own hard disk.  If you have both P: and C: drives as “watched folders”, all hell breaks loose: Picasa will start copying folders to the local computers, into the wrong folders, with wrong labels, resulting in total chaos (I’ve been there…).  So once again we’ll cheat: find a way to refer to the central PicasaLib under the same drive letter from all computers.

Re-mapping the “server”

Not a true server, but playing that role in this case: this is the computer that has all the Picasa files and that we’ve just set up for multi-user access in the previous exercise.  We want to use the P: designation, but can’t simply rename our main hard disk, nor can we map it as a network drive, so we’ll apply the symbolic link trick again: set up a link from the root folder to the public folder.  Steps:

Get an elevated cmd prompt (see details above)

cd \   (back to root folder)

mklink /d P  C:\Users\Public\   (create the symbolic link)

You now have what the system thinks is a P folder, and can use it in the Picasa “watched folders” definition.  Which means you need to start Picasa, then navigate to Tools > Folder Manager, and select \P\Pictures as the folder to watch – unselect everything else.  For a few minutes you’ll see heavy activity, and for a while Picasa may show duplicate folders, double the number of pictures you really have, but will eventually sort it all out. 
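
In case you’re wondering what actually landed in the shared configuration: Picasa rewrites that selection as an absolute path, so watchedfolders.txt in the shared Picasa2Albums folder should now contain little more than a single line along the lines of

C:\P\Pictures

– which is exactly why every computer on the network has to see the photo library under the same drive letter, the problem the next section solves.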

Setting up the “clients” or “satellite” computers

These are the computers that have no local Picasa information; we just use them to access our main library on the “server”.  Ideally we would just map the public folder on the server as network drive P:, but we’d run into a syntax problem.  On the main computer Picasa prefixes our watched-folder setting, changing \P\Pictures to C:\P\Pictures, which works just fine there but not on the satellite machines.   And Picasa’s watchedfolders.txt does not accept a \\Computername\Foldername designation – it has to look local.

So we go back to our friend… yes, you guessed it right, symbolic links. But now even this old friend lets us down: we cannot define a symbolic link to a network folder, only to a local one. Oh well, we’ll outsmart the system again by combining network mapping with symbolic linking: we’ll map the network drive to an interim drive letter first, then link to that interim drive.  Steps:

From Windows, map the \\MainComputername\Users\Public\ folder as drive O:

Make sure there is no Picasa2 or Picasa2Albums folder in \Users\username\AppData\Local\Google

Get an elevated cmd prompt

cd \  

mklink /d P  O:\   (note the colon and backslash: we link to the root of the mapped O: drive)

cd  \Users\username\AppData\Local\Google

mklink /d Picasa2  O:\PicasaLib\Picasa2

mklink /d Picasa2Albums O:\PicasaLib\Picasa2Albums

Repeat the last three steps for all user accounts that should access Picasa from this computer.  Logically you would now have to open Picasa and change the watched folders to P\Pictures, but there’s no need: it’s already set up on the server machine, and you’ve just told Picasa to pick up all parameters and data from there.
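
For reference, here is the whole client-side sequence in one place, as typed into an elevated prompt.  MainComputername stands for your own “server’s” network name, the first line is simply the command-line equivalent of mapping the drive from Windows in step one, and the last three lines are the per-user part you repeat for each account:

net use O: \\MainComputername\Users\Public /persistent:yes

cd \

mklink /d P O:\

cd \Users\username\AppData\Local\Google

mklink /d Picasa2 O:\PicasaLib\Picasa2

mklink /d Picasa2Albums O:\PicasaLib\Picasa2Albums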

In other words, you’re all set.  Needless to say, this only works as long as your “server” is turned on :-) but then all computers on your network will see the very same photos, and all editing, manipulation, face or geo-tags, albums… are immediately updated in the central database and reflected on all computers, no matter where you originate them.

Warning:  as stated before, I have not tested what happens when multiple users access / attempt to update Picasa data at the same time, but I assume it’s not a very good idea.  We’re ‘cheating’ here – Picasa was not designed to work in a multi-user environment – so let’s play it safe: only one computer and one user should access it at any one time.

Conclusion

We’ve just turned a hopelessly single-user, single-computer product into a networked one. Sort of. :-)  It’s obviously just a workaround, and…well, read my disclaimers again.

Now, let’s remember, Google isn’t really a PC or local-network software company. They are The Web Pioneers. I think the long-term solution will be much stronger integration with Picasa Web Albums.  Currently you can mark your folders / albums to synchronize with the Web version, but it’s one-way, from PC to Web only.  That’s not what I call full synchronization.  If you change anything in Web Albums, it’s not reflected back in your local Picasa library.  I believe the future is full two-way – actually multi-directional – synchronization, where Web Albums become the conduit between any number of client computers that run Picasa.  It’s not impossible: services like Syncplicity already do similar synchronization – Google just has to embed it in Picasa’s Web capabilities.  It’s time for Google to create a seamless online / offline photo management environment.

(This how-to guide was originally posted as Part 1: the Problem and Part 2: the Solution @ CloudAve)

post

Ma.Gnolia Data Loss – Is Your Data Safe?

Ma.gnolia, a social bookmarking service, is down: they lost all their user data and don’t know if / when they can recover it.

This is as bad as it can get for any Web 2.0 service (and more importantly, for its users), and the backlash against Cloud services has already started.   My first reaction is to take Stowe Boyd’s approach – a quick overview of how safe my own data is.

Read More

Update: also read Krish’s post @ CloudAve: Magnolia Effect – Should We Trust The Clouds?

post

Startups, Remember: Transparency, Transparency, Transparency

  • How can people even think of launching a service without revealing the price upfront?
  • How can they expect users to go through the hassle of signing up, installing software, only to find the price info after all this?
  • Why do people still fall for this?

I’m discussing the above and more, using Zumodrive’s launch as a case study, over @ CloudAve – read the details here.

post

Google Lockouts are not Fun. Are You Prepared?

Loren Baker, Editor of Search Engine Journal, discusses his experience of getting his Google account frozen without warning.  Nothing new – we see these cases every few months. If you’re a well-known blogger like Loren, getting a resolution might take 15 hours; I don’t even want to think how long it would take less prominent users to get their account issues fixed.

There are a few things we can all learn from Loren’s case:

  • Communication – $50 buys you Phone Support
  • Backup – offline, within Google or another Web service
  • Your Domain – should be a no-brainer for branding reasons anyway, and when all hell breaks loose, it allows you to quickly switch to another provider.

I’m discussing these and other steps to avoid disruption on CloudAve. (To stay up-to-date on SaaS, Cloud Computing and Business, grab the CloudAve Feed here).

post

Sync Update: Syncplicity, Dropbox, Windows Live (?) Mesh

Quick update to my recent Syncplicity review:

In the meantime Microsoft’s Windows Live Mesh opened to the public, combining synchronization and backup – also competing with their own Foldershare.  Not a word on what will happen to Foldershare, but I guess the writing is on the wall.  That said, Live Mesh just failed for me for the second time, so I can’t really recommend it.

Another service, Dropbox, is getting a lot of buzz nowadays, largely thanks to a smart scheme of giving out limited numbers of beta invitations.  Apparently artificially created scarcity is good marketing: bloggers LOVE being able to give away 10 or so invites…

Dropbox has one advantage over Syncplicity: it’s multi-platform, including Apple’s OS X and Linux, whereas Syncplicity is Windows-only for now.  But that’s where it ends: it has fewer features (forget Web Apps integration, e.g. Google, Zoho, Scribd, Picnik), and it has what I consider a huge flaw:  you have to drop your files into a dedicated folder to be synchronized.   That may be reasonable if you want to collaborate on a limited set of files, but it simply does not resolve the “access to all my data anywhere, anytime” problem.  It’s certainly a show-stopper for me.

So if you’re waiting for a Dropbox invitation, you might as well try Syncplicity – you’ll love it.  And if you sign up here, you get 1G more, i.e. 3G of free storage instead of the standard 2G.

Update: I received a very good, constructive comment from Assaf, who pointed out this was a largely negative post.   In my mind this post is an extension of my original Syncplicity review, but now that I re-read it on its own, I agree with Assaf.  Please read my response here, which makes this post complete.

post

Syncplicity: Simply Excellent Synchronization, Online Backup and More

(Updated)
In today’s world, where features are hyped as products and project teams masquerade as companies, it’s truly refreshing to see a service that’s almost an All-in-One (OK, perhaps Four-in-One) in its category, which I would loosely define as protecting, sharing and synchronizing one’s data.

Recently launched Syncplicity:

  • Synchronizes your data across multiple computers a’la Foldershare
  • Provides secure online backup a’la Mozy
  • Facilitates easy online file sharing a’la box.net, etc.
  • Integrates with  online services like Google, Zoho, Scribd, Picnik (somewhat like now defunct Docsyncer?)

An impressive list by all means.   Oh, and congrats to the team for finding an available domain name that’s actually a perfect description of what they do.  The simplicity part probably refers to the ease of installation and use, not the task they perform in the background ;-)

Getting Started
Registration and installation of the client are quick and easy; more importantly, after the initial configuration you can forget about the software – it works for you in the background, non-intrusively, allowing you peace of mind.  You can leave it to Syncplicity to find all your document and media files, or specify the directories to be synchronized.  The process allows more granular control than Foldershare, where one of my gripes was that if I select My Documents (a fairly obvious choice), I cannot exempt subdirectories, which results in conflicts with some stubborn programs (e.g. Evernote).  With Syncplicity you can precisely fine-tune what you want synchronized; in fact they indicated that filename-based exclusion is in the development plan. (If you ever had your Picasa.ini files messed up by Foldershare, you know what I am talking about…)

Synchronization
The major difference compared to Foldershare is that Syncplicity is not a peer-to-peer product: it actually uploads your files to their servers, where they are encrypted (AES-256) and are available either to the Syncplicity clients on your other computers, or directly, via a Web browser.  This may be a show-stopper for some, and a convenience for others: unlike Foldershare, this approach does not require all synchronized computers to be online at the same time.  And since the files are stored online, it might as well be used as a backup service – this is where we enter Mozy-land.

Backup
The two major differences vs. Mozy are encryption and ease of restoring files from the backup set.
Mozy performs all encryption on your computer and even allows you to pick your private key: it can hardly be any safer (so safe that if you lose the key, your files are gone forever).  Syncplicity transmits your files over SSL, and the AES 256-bit encryption occurs in their data center, using a random key that is then sent off to a different location. Since they hold the key, there’s definitely a trust issue to ponder here.
Of course a backup solution is only as good as the restore, and unlike Mozy – which sends you a zip file hours after your request, to be decrypted on your PC – Syncplicity couldn’t make accessing your files any simpler.  Install the client on any PC and auto-download entire directories, or just browse the online version, check file revision history and pick what you’d like to download manually.

Sharing
Syncplicity offers both file and folder-level sharing: from your PC, right-click on any file to get a shareable link, which will allow anyone you email it to download the file from their website.  Or share entire folders to any email address, and the receiving party can either browse the folder’s online version, or, if they have the Syncplicity client installed, you both will have identical copies on your computers.  You can further specify view-only or edit access – the latter takes us into collaboration-land: updates made by any sharing party will be synchronized back to all other computers.  Be aware though that each party will still work on individual copies prior to save/sync, so with long multiple edits it’s quite possible to end up with several versions of the same document, due to Syncplicity’s conflict resolution.

This is why I believe real-time online collaboration is superior: there’s only one master copy, and no confusion between revisions.  This is what Google Docs and Zoho offer, and – surprise, surprise! – Syncplicity won’t let you down here, either.
They have created the best seamless offline/online integration with Google Docs I’ve seen: at the initial run your designated PC folder (e.g. My Documents) gets uploaded to your Google Docs account, and your Google docs are placed in a subdirectory on your computer.  From this point on you can edit these documents using Google Docs, Word, Excel, etc. – your offline and online versions will be kept in sync.  This is pretty good, but not perfect: since Google Docs only supports a subset of Word’s functions, after an online edit Syncplicity keeps two (and potentially more) versions of the same file – one with the latest changes, the other with the full set of Word functions that would otherwise be “lost” in the conversion to Google.

Syncplicity’s most recently added online partners are:

  • Zoho – Right-click for the  ‘Edit in Zoho’ option.  Saving updates the document both on your computer and on Syncplicity – initially NOT on Zoho itself, but that has since been fixed (that was fast).
  • Scribd – iPaper view of your files on the desktop.
  • Picnik – Right-click to choose “Edit in Picnik” for all your photos.

The Zoho integration presents a funny situation: you can now use Zoho Writer to save a file to your Google Docs space (Zoho>Desktop>Syncplicity>Google).  Not sure how practical this is, but I like the irony of a third party creating Zoho>Google integration :-P  On a more serious note, what I really would like to see is full Syncplicity<>Zoho integration, like it works with Google today (and since Zoho supports more Word functions, the conversion should be less lossy).  And while on the wish-list, how about sync-ing to Flickr?

Is it for you?
First of all, pricing: free for two computers and 2G of space; $9.99/month or $99 annually for any number of computers and 40G of storage.  You can sign up here to get 1G more, i.e. 3G of free storage, or 45G on paid accounts (using ZOLIBLOG as the invitation code also works).  The price-tag is clearly heftier than, say, Mozy or the free Foldershare, but you get a lot more functionality – and oh boy, when did box.net become so expensive?

The one potential downside is the fact that Syncplicity is a pre-funding startup. Will they survive?  This market has seen casualties (Docsyncer, Omnidrive?), successful exits (Mozy, Foldershare), and stable, ongoing services.  The answer is: who knows?   The Founders are ex-Microsofties, they’ve put an amazing service together in a very short time, so I’d put my chips on them, but in business there are no guarantees.

A better question to ask is what your real risk is.  If online backup is critically important to you and you are already paying for a service like Mozy, I wouldn’t abandon it yet (Mozy is now owned by EMC, it’s not going anywhere).  If you’re mostly just syncing for now, or don’t have a solid backup solution yet, there’s not much to lose. Even if Syncplicity were to disappear, your files would still be replicated in several places, so you don’t lose access.

In fact, by signing up, you help Syncplicity show traction, which is critical in the funding process, so you can help solidify their position.  Happy Sync-ing! :-)

Update (7/17): In the meantime Microsoft’s Windows Live Mesh opened to the public, combining synchronization and backup – also competing with their own Foldershare.  Not a word on what will happen to Foldershare, but I guess the writing is on the wall.  That said, Live Mesh just failed for me for the second time, so I can’t really recommend it.

Another service, Dropbox, is getting hyped a lot nowadays, largely thanks to a smart scheme of giving out limited numbers of invitations.  Apparently artificially created scarcity is good marketing: bloggers LOVE being able to give away 10 or so invites…   Dropbox has one advantage over Syncplicity: it’s multi-platform, including Apple’s OS X and Linux, whereas Syncplicity is Windows-only for now.  But that’s where it ends: it has fewer features (forget Web Apps integration), and it has what I consider a huge flaw:  you have to drop your files into a dedicated folder to be synchronized.   That may be reasonable if you want to collaborate on a limited set of files, but it simply does not resolve the “access to all my data anywhere, anytime” problem.  It’s certainly a show-stopper for me.

So if you’re waiting for a Dropbox invitation, you might as well try Syncplicity – you’ll love it.  And if you sign up here, you get 1G more, i.e. 3G of free storage instead of the standard 2G.

Update #2:  Congratulations to the Syncplicity team on their funding.

post

Windows Barely Live Mesh and Why TechCrunch Needs a New Tab

Steve Gillmor redefined TechCrunch today with a thoughtful but loooong (1709 words!) post on Windows Live Mesh. Others come to the rescue, translating him:

Robert Scoble: But, let’s translate Gillmor: Microsoft Mesh is fascinating. Agreed.

Phil Wainewright: Steve turns that around and points out that what Mesh is really about is connecting the desktop into the cloud

Mike Arrington: I’m pretty sure he’s saying Mesh = good.

Even Microsoft’s Steve Clayton is lost:

I got lost about two thirds of the way in to this post from Steve Gillmor but the first third was a great read. Actually the whole thing was but I just got a bit lost as I think some of the things going on in Steve’s fast thinking brain didn’t quite make it through to the keyboard so you’re left having to assume some things. I’m assuming he likes Mesh though. I think he does.

Commenters on TechCrunch were ruthless, I won’t even begin quoting them. But don’t get me wrong: this is a good article, which would have been a great fit for ReadWriteWeb, but the TC crowd expects short, to-the-point, fairly descriptive posts. In the words of TC owner Mike Arrington:

Steve is an acquired taste. his writing isn’t efficiently packaged into bite sized chunks like a lot of people have come to expect. but if you decide to give it the attention it needs, you may find that you come away a little bit smarter after you’re finished.

Yes. And perhaps Mike is trying to redefine TC’s style himself. But you have to know your readers, Mike – perhaps a new tab for Essays would be appropriate – or if you want Gillmor’s writing to be part of the main flow, a graphical “grab a coffee, this is a long one” icon would help.

Now, on to the bigger question, why Live Mesh is just Barely Live. (And yes, this will be a long post, too, but due to the screenprints.)

The first leaked news declared this a solution to “sync everything with everything”. Then came Amit Mital, Live Mesh General Manager, with a visionary video and announcement at the Web 2.0 Expo last week, adding towards the end: initially it will sync only Windows PCs, adding more platforms and devices over time. Ahh! So it’s a … Foldershare for now.

Minutes after the presentation I was chatting with a startup CEO who reminded me he had seen a similar video from Microsoft years ago: kid playing, Mom capturing video on cell-phone, family watching it almost real-time on various devices, executive-type Dad watching video on his laptop at an airport feeling “almost at home”. Great video, and yes, it was conceptually familiar, but what has materialized of it?

Live Mesh will be great when it really happens, but for now it’s largely vaporware: a pre-announcement, typical Microsoft style. And now, if you’re still here, why don’t you follow me through the hoops of trying to sign up for (Barely) Live Mesh.

Google Search and several Microsoft blogs point to http://mesh.com so that’s where we start:

Hm… I could never figure out why I so often get signed out of Live Network (good old Passport style), and if that’s the case, why I can’t sign back in here. But that’s OK, we just take a detour to live.com, sign in and come back to mesh.com:

I thought I had just signed in, but fine, let’s do it again. Oops:

The sign-in button changes to sign up – as in sign up for a waiting list. Not fun.. but let’s do it anyway. Btw, before the wait-list screen there was another screen where I had to agree to some terms – sort of usual for actually using a service, but for getting on a waiting list?

Now we’re in something called Microsoft Connect. Is this the same thing? Who knows…let’s click Register (but why, after sign-up, sign-in, agree, now register? WTF?)

I’m starting to really not like this. So far I’ve been presented with a maze of registration, confirmation, you-name-it screens, and I don’t know where the hell I am. Let’s backtrack a bit.

Oh, several screens above, at the waiting-list signup, it stated that on the next screen I should click Connection Directory, a small option at the top, not the main Register for Connect link… but who reads the small print – all screens should offer enough navigational clues to not get me lost. OK, redoing, now…

This jungle is the Connection Directory. No sign of Live Mesh, at least not on the first page. Text search to the rescue: there we are… somewhere towards the bottom (scroll way down) there is Live Mesh Tech Preview! Voila! (or not). The button to click is Apply Now! As if I hadn’t done it a zillion times already…

Hm.. I can do this now with my eyes closed… click.. click..click.

Geez, this looks like a plain old BS signup form again. I’ve had it. Done. I let others experiment with Microsoft’s Windows Dead Mesh. Let me know when it’s Live. For real.

post

The Cell-Phone Aware PC May Be a PC-less PC

Mike Egan @ Computerworld makes the case for PCs to be smarter, with improved awareness of cell phones – which really means awareness of their owners.

PCs would benefit greatly from awareness about the location of the user. Is she sitting in front of me? Is she out of the building? Imagine if your PC performed routine maintenance, or kicked into security mode when it knew you weren’t around. Since we take them wherever we go, cell phones are ideal devices to inform our PCs whether we’re in the room or not.

We like to set up our PCs just so, with color schemes and specific files and applications we like to use. Imagine if our phones could carry sets of configurations around and magically transform any PC we happen to be using into one set up just like the computer at home or in the office.

We work on documents, then go home and work on them some more. Why don’t phones automatically carry the latest version and upload it to whichever PC we’re using? Why do most of us still use e-mail for this?

A recent Gartner study discusses similar concepts, named “Portable Personality Solutions.” Whether the medium is thumb drives, as in the Gartner study, or cell phones, as in Egan’s vision, the core idea is the same: your preferences, your “digital personality”, are always with you in your device, and are uploaded and downloaded wirelessly and automatically to whatever computer you want to use.

I like the concept, but it involves unnecessary steps: far too many uploads and downloads, a sure sign that it’s based on today’s computing model, instead of tomorrow’s. I laid out a similar but more far-reaching concept last year:

  • the mobile phone brings the connectivity, browser and some personalization
  • the actual work devices are cheap displays and keyboards, easily found anywhere.
  • the apps and data are on the Net

Can you spot the key difference? There is no computer. Yes, the PC is gone; the display and keyboard are there for convenience reasons (who doesn’t like large displays?), and the mobile device can do the minimal processing I need, since the heavy workload is carried in the Cloud. Granted, this is not the solution for 3-D modeling, video editing and the like, just for regular productivity work – which is what most of us use computers for anyway.

Now, to be fair, this is not really my concept, I was just interpreting Zoho CEO Sridhar Vembu’s personal computing nirvana vision. Recently he developed his vision a step further (actually, it’s not him dreaming further, it’s the technology that advances fast):

Given how mobile phones pack a whole lot of functionality in a tiny package, I have wondered if the ideal server farm is just tens of thousands of mobile phones packed together. It seems to me that the semiconductor technology behind mobile devices is far, far more power efficient than the stuff that goes in servers. Partly it is a backwards compatibility issue, with servers having to run code written all the way back to 1980s, while mobile phones simply didn’t exist that far back. Partly, it is also a function of how traditional client-server applications were architectural monoliths, compared to the deeply distributed “service-oriented architecture” that is common in web applications today.

With mobile phones approaching very respectable CPU & memory capacity, packaging them together as a server cluster makes a lot of sense. Linux can run on almost all of the modern CPUs common in cell-phones, and the mobile version of Java seems actually well-suited for server use, particularly for deeply partitioned, distributed applications. Lightweightness is actually an advantage in server software, just as it is in mobile software.

I wonder how far-fetched this vision is, but I have to say this former Qualcomm engineer, who just spent a few million dollars to create two data centers which will soon provide automatic failover, might just know what he is talking about… 8-)

Update: “Spanning Sync” Charlie is thinking along similar lines: Will Your Next PC be an iPhone?

Update (4/13): Is it Time For a Portable Dumb Terminal?

post

Windows "Live" (Now Dead) Foldershare Has an Architectural Weakness

Foldershare is a handy tool that keeps several PCs in sync – most of the time, when it works.  Of course sometimes it goes down, defying its new Windows Live moniker. :-$

Unlike the previous, week-long outage, this one was just a few hours, but even now as it recovers, users can’t log in:

Outages are inevitable, but the repeated incidents made me realize that Foldershare has a design glitch: its dependence on logging in to a web server for no good reason.

  • Yes, I understand setup, customization is all through the Web.
  • However, once set up, the need to change the configuration is rare; the whole idea of Foldershare is that it just runs in the background, with users barely noticing it even exists.  It does NOT sync / upload actual data to the Web server – all synchronization is strictly P2P.  In fact one of the setup options is to define whether you allow remote P2P sync to occur through the Net, or strictly on your LAN, behind the firewall.

Why on earth my Foldershare clients on 3 computers have to sign in to the Web to be able to carry out behind-the-firewall synchronization is beyond me.  Could they not cache the latest config locally, and use it whenever log-in fails?

Of course I have previously speculated that Microsoft should tie Foldershare and Skydrive, offering both PC sync and Web backup, in which case logging in becomes a reasonable requirement.  But even then, local sync should be available as a fall-back option for outages.

Update (2/13):  A day later Foldershare clients still can’t log in.  Perhaps it’s time to change “the next couple of hours” to “the next couple of days”. :-(