post

Redux: Will Google Enter the Business Applications Market?

I am dusting off an old post I wrote more than two years ago, and while that shows my laziness, I am doing it in the belief that the ideas I raised back then may soon get an answer … albeit not exactly the way I imagined.  And interestingly enough, just about every company I mentioned may have a role in it.  Or not.  After all, it’s just speculation.  So here’s the old post:

Google’s next killer app will be an accounting system, speculates Read/WriteWeb. While I am doubtful it will happen, I enthusiastically agree it could be the next killer app; in fact, why stop there – why not add CRM, Procurement, Inventory, HR?

The thought of Google moving into business process / transactional systems is not entirely new: early this year Nick Carr speculated that Google should buy Intuit, soon to be followed by Phil Wainewright and others: Perhaps Google will buy Salesforce.com after all. My take was that it made sense for Google to enter this space, but it did not need to buy an overpriced heavyweight; rather, it could acquire a small company with a good all-in-one product:

Yet, unlikely as it sounds, the deal would make perfect sense. Google clearly aspires to be a significant player in the enterprise space, and the SMB market is a good stepping stone – in fact more than that, a lucrative market in itself. Bits and pieces in Google’s growing arsenal: Apps for Your Domain, JotSpot, Docs and Sheets… Recently there was some speculation that Google might jump into another acquisition (ThinkFree? Zoho?) to be able to offer a more tightly integrated Office. Well, why stop at “Office”, why not go for a complete business solution, offering both the business/transactional system as well as an online office, complemented by a wiki? Such an offering combined with Google’s robust infrastructure could very well be the killer package for the SMB space, catapulting Google to the position of dominant small business system provider.

This is probably a good time to disclose that I am an Advisor to a Google competitor, Zoho, yet I am cheering for Google to enter this market. More than a year ago I wrote a highly speculative piece: From Office Suite to Business Suite:

How about transactional business systems? Zoho has a CRM solution – big deal, one might say, the market is saturated with CRM solutions. However, what Zoho has here goes way beyond the scope of traditional CRM: they support Sales Order Management, Procurement, Inventory Management, Invoicing – to this ex-ERP guy it appears Zoho has the makings of a CRM+ERP solution, under the disguise of the CRM label.

Think about it. All they need is the addition of Accounting, and Zoho can come up with an unparalleled Small Business Suite, which includes the productivity suite (what we now consider the Office Suite) and all the process-driven, transactional systems: something like NetSuite + Microsoft, targeted at SMBs.

The difficulty for Zoho and other smaller players will be on the Marketing / Sales side. Many of us SaaS pundits believe the major shift SaaS brings about isn’t just in delivery/support, but in the way we can reach the “long tail of the market” cost-efficiently, via the Internet. The web customer is informed, comes to your site, tries the product, then buys – or leaves. There’s no room (or budget) for an extended sales cycle, site visits, customer lunches, the typical dog-and-pony show. This pull model seems to be working for smaller services, like Charlie Wood’s Spanning Sync:

So far the model looks to be working. We have yet to spend our first advertising dollar and yet we’re on track to have 10,000 paying subscribers by Thanksgiving.

It may also work for lightweight Enterprise Software:

It’s about customers wanting easy to use, practical, easy to install (or hosted) software that is far less expensive and that does not entail an arduous, painful purchasing process. It should be simple, straightforward and easy to buy.

The company whose President I’ve just quoted, Atlassian, is the market leader in its space, listing top Fortune 500 companies as customers, yet it still has no sales force whatsoever.

However, when it comes to business process software, we’re just too damn conditioned to expect cajoling, hand-holding… the pull model does not quite seem to work. Salesforce.com, the “granddaddy” of SaaS, has a very traditional enterprise sales army, and even NetSuite, targeting the SMB market, came to similar conclusions. Says CEO Zach Nelson:

NetSuite, which also offers free trials, takes, on average, 60 days to close a deal and might run three to five demonstrations of the program before customers are convinced.

European All-in-One SaaS provider 24SevenOffice, which caters for the VSB (Very Small Business) market, also sees a hybrid model: automated web sales for 1-5 employee businesses, but above that they often get involved in some pre-sales consulting and hand-holding. Of course I can quote the opposite: WinWeb’s service is bought, not sold, and so is Zoho CRM. But this model is far from universal.

What happens if Google enters this market? If anyone has the clout to create or expand a market and change customer behavior, it’s Google. Critics of Google’s Enterprise plans cite their poor support level, and call on them to essentially change their DNA or fail in the Enterprise market. Well, I say: Google, don’t try to change, take advantage of who you are, and cater for the right market. As consumers we all (?) use Google services – they are great when they work, **** when they don’t. Service is non-existent – but we’re used to it. Google is a faceless algorithm, not people, and we know that – we’ve adjusted our expectations.

Whether it’s Search, Gmail, Docs, Spreadsheets, Wiki, Accounting, CRM, when it comes from Google, we’re conditioned to try-and-buy, without any babysitting. Small businesses don’t subscribe to Gartner, don’t hire Accenture for a feasibility study: their buying decision is very much a consumer-style process. Read a few reviews (ZDNet, not Gartner), test, decide and buy.

The way we’ll all consume software as a service some day.

(Cross-posted @ CloudAve )

post

Bring On The Inquisition! Judge Should Not Let Google Evil-doers Get Away!

<rant>

Italian Judge Oscar Magi (photo @ TechCrunch, but I am not showing it for fear he might slap me with a privacy-invasion charge) has no idea what he’s dealing with.  He’s just allowed evil witches to get away with only a 6-month suspended jail sentence. (C’mon, why else would their employer insist on covering it up with the “do no evil” slogan?)

Not enough. Time to start a full-scale witch-hunt, bring in the Spanish Inquisition (OK, you’re Italian, but let’s face it, the Spanish were the masters of this art), shut down Google, then the Internet, close libraries, burn all books and rid us of all evil!

</rant>

(Cross-posted @ CloudAve )

post

Using Picasa on Multiple Computers – The Updated Definitive Guide


My 4-year-old how-to guide, Picasa Photo Sync on Multiple Computers, has attracted tens of thousands of viewers and is still quite popular.  In fact too popular, thanks to Google.  I can’t believe people actually read it today and try to follow the advice therein… it’s an OLD post with outdated information.  I’ve long struggled trying to find a better solution… and now that I have it … drumroll … but wait, first things first:

What’s the problem with Picasa?

Picasa is my favorite photo management program, and hey, it’s hard to beat free!  Yes, I believe SaaS is the future of computing, and I do keep many photos online (just canceled Flickr Pro in favor of PicasaWeb), but quick-and-dirty manipulation of large image files en masse is still easier and faster on a local PC.  Or on any one of the computers I use – if only I could.  It’s hard to believe that Google, an undeniably Web-centric company, would create an application designed to be used by one single user on one single computer – that’s a stone-age vision, very much at odds with being a visionary Web company. 

Picasa does not save your edits in the image file itself, rather it uses a set of system files: picasa.ini files in every photo folder and a bunch of proprietary databases in two hidden system directories.  This is actually a good concept, you can experiment and safely revert back to the original –  trouble starts when you want to move to a new computer, or God forbid access your photos from multiple computers – some of the associated changes will come through, others won’t.  You will soon have multiple versions of the databases and sometimes of the images themselves, and that leads to chaos. 

Early Solutions

The original concept in my previous guide was based on syncing the hidden Picasa databases between all computers involved. It worked for a while… then I started to see corrupted databases, so I abandoned synchronization.  In the meantime wireless home networks became more robust, so instead of redundant chaos, the next best option was maintaining one central Picasa home base and accessing it from other computers via the network.  This could quite easily be done by mapping the main computer’s drive as a network drive, say P: (for Photos or Picasa), setting Picasa on all the satellite computers to forget the local Pictures folders and only scan the new P: drive. 

In this setup Picasa still had to index all images it read from the network and recreate a local database on the individual computers, so the solution was quite redundant – but it worked relatively well.  Through a succession of new releases Google moved more information on user edits into the per-folder Picasa.ini files, so the system was able to rebuild the database almost completely.  Cropping and some other information was still missing, so you could never be 100% certain you were looking at identical versions of your images.  The safest way to avoid confusion and different views of the same photos was to make a policy of only editing images on the “main computer” where they were stored, thus relegating all other networked computers to passive viewers only. 

There has to be a better solution.. one that allows any member of the family (and any user account) using any computer on a network to share the one and only Picasa database – view and edit all the same, with any changes, tagging, editing immediately saved no matter which computer is being used.   Yes, there is one – keep on reading :-)  But first some disclaimers:

  • I’ve tested the solutions below in Windows 7
  • They should work on Vista, too, and I believe there is a logical equivalent under XP, but I’ve never checked it
  • These solutions work for me, but I can not guarantee they will work for you – experiment at your own risk
  • Before making any changes, do back up your Picasa database (both photos and the system data)
  • Even if everything works, there’s no way of knowing if a future Picasa release will change it all…
  • I’m not a Windows Guru, and make no claims that this is the best or most elegant solution – just one that works for me
  • I cannot provide individual support – you are welcome to comment / contribute below, and may get a response from another reader, but I can not make promises.

Now, we’re ready to rock and roll …

Sharing Picasa Between Multiple User Accounts on the Same Computer

You may only be interested in the multi-computer setup, but please read this chapter anyway, as we will build on the logic outlined here when we expand to a network setup.

Move your photo library to a public location

By default most photos are stored at user account specific image libraries, with a default path similar to this in Windows 7 and Vista:

C:\Users\username\Pictures  

You could fiddle around with sharing / security properties to enable other user accounts access this image folder, but moving your photos to the public folder is a much cleaner solution.  The new destination is:

C:\Users\public\Pictures

Although the easiest way to move folders is from Windows, it’s always better to do it within Picasa, to allow its databases to be updated properly.   If you use nested folders, you’re in luck: you can just right-click on the top-level folder, select “Move Folder”, pick the new destination, and you’re done.  (If you have nested folders but don’t see them in Picasa, change from “Flat View” to “Tree View” in the main View menu.)  If you have a lot of flat folders, this may be a cumbersome process, but it’s one-time only.

This was easy … now close Picasa and let’s really get started :-)  Two reminders before we start:

  • you’ll need to do all this using an account with Admin privileges
  • backup, backup, backup (your photos and system folders / files) – see the quick sketch right below
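If you’d rather script that backup than drag folders around in Explorer, here is a minimal sketch from a command prompt – the E:\PicasaBackup destination and the username placeholder are assumptions, so substitute your own:

rem Back up the photo library and Picasa's hidden databases to an external drive (E: is assumed)
robocopy C:\Users\username\Pictures E:\PicasaBackup\Pictures /E
robocopy C:\Users\username\AppData\Local\Google\Picasa2 E:\PicasaBackup\Picasa2 /E
robocopy C:\Users\username\AppData\Local\Google\Picasa2Albums E:\PicasaBackup\Picasa2Albums /E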

Move Picasa’s internal databases to a public folder

The internal Picasa databases are originally in two system folders in Windows 7 / Vista:

C:\Users\username\AppData\Local\Google\Picasa2

C:\Users\username\AppData\Local\Google\Picasa2Albums

You’ll need to create a new home for these two folders, for example this:

C:\Users\Public\PicasaLib

Now move the Picasa2 and Picasa2Albums folders to the newly created PicasaLib folder, so their new locations are:

C:\Users\Public\PicasaLib\Picasa2

C:\Users\Public\PicasaLib\Picasa2Albums
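If you prefer to do this move from a command prompt rather than Explorer (an elevated one is safest – how to get one is described a bit further down), a minimal sketch, assuming the default paths above and your own username, looks like this:

rem Create the new shared home for Picasa's databases
mkdir C:\Users\Public\PicasaLib
rem Move both database folders there (replace "username" with your account name)
move C:\Users\username\AppData\Local\Google\Picasa2 C:\Users\Public\PicasaLib\Picasa2
move C:\Users\username\AppData\Local\Google\Picasa2Albums C:\Users\Public\PicasaLib\Picasa2Albums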

Well done. Too bad Picasa is still looking for these databases in the old place…

Trick Picasa into finding the new database location

At this point you should no longer have a Picasa2 and Picasa2Albums folder in your C:\Users\username\AppData\Local\Google\ folder – if you do, you likely copied them to the new destination instead of moving.  If that’s the case, please delete them now – we can’t have real folders with those names here, since we are going to replace them with Symbolic Links that look just like the deleted folders but will actually redirect Picasa  to the new location.

For the next steps even though you’re logged into a user account with Admin rights,  you will need an elevated command prompt. If you’re like me and can’t remember hot-key combinations, here’s how to get it: Click the Start menu and type cmd in the run box, but do not hit enter. Instead, find cmd.exe at the top of the list, right-click on it, then left-click Run as Administrator.

Now you’re in a command box that reminds you of good old DOS.  Navigate to the original Appdata folder:

cd  \Users\username\AppData\Local\Google   

Now type these lines exactly as you see them:

mklink /d Picasa2  C:\Users\Public\PicasaLib\Picasa2

mklink /d Picasa2Albums C:\Users\Public\PicasaLib\Picasa2Albums

You have just created two entries that look like the Picasa2 and Picasa2Albums folders but actually point to their newly created location.
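To double-check, a plain dir in that folder should now show the two links as <SYMLINKD> entries pointing at the new location – output trimmed here, but roughly like this:

dir

<SYMLINKD>     Picasa2 [C:\Users\Public\PicasaLib\Picasa2]

<SYMLINKD>     Picasa2Albums [C:\Users\Public\PicasaLib\Picasa2Albums]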

Update and verify Picasa for each user

Open Picasa, go to Tools > Folder Manager and make sure only the new public destination is selected, nothing else – certainly not user specific libraries.

Repeat the above relocation steps for all other users on the same computer, and check their Folder Manager setting in Picasa.

You’re all set!  All users now have shared access to all public photos, and edits, changes, thumbnails..etc are all maintained in a central database instantly available to all users.

Warning: I have not tested what happens if multiple users try to update the databases at the same time, but I assume it is not a very good idea.  Best practice is probably to avoid quick User Switching at all – rather, log out of one user before logging into another – but at a minimum, even if you do quick-switch, don’t leave Picasa open in two user accounts at the same time.

Picasa on Multiple Computers

This is what you’ve been waiting for…  we’re actually very close, the logic is surprisingly simple: map the drive that has our Picasa library and databases as a network drive, say P, then apply the tricks we’ve just learned doing the multi-user setup on the same computer, but now the symbolic links will point to the public folders on the P: drive, and voila!

Well, almost…too bad there are a number of quirks that we have to deal with first.  Let’s take them up one by one.

Your network layout

If you have a NAS drive, which is for passive storage only, accessed by several computers on the network, then the above solution will work, since you can map the NAS drive to the same drive letter on all computers.  But if your network is like mine, i.e. there is no NAS and Picasa resides on one of the actively used computers which all the others access, then you run into all sorts of trouble.  Here’s why: Picasa stores the result of your “watched folders” configuration in a plain text file named watchedfolders.txt in the Picasa2Albums folder.  But we’ve just moved that folder to our shiny new PicasaLib to be shared by all instances of Picasa – that means they cannot have different “watched folders” settings per instance.

The problem is, the “main” computer will see the Picasa storage as its C: drive, while all others have to refer to it by another drive letter, since C: is reserved for their own hard disk.  If you have both P: and C: drives as “watched folders”, all hell breaks loose: Picasa will start copying folders to the local computers, into the wrong folders, with wrong labels, resulting in total chaos (I’ve been there…).  So once again, we’ll cheat: find a way to refer to the central PicasaLib under the same drive letter from all computers.

Re-mapping the “server”

Not a true server, but playing that role in this case: this is the computer that has all the Picasa files and that we’ve just set up for multi-user access in the previous exercise.  We want to use the P: designation, but can’t simply rename our main hard disk, nor can we map it as a network drive, so we’ll apply the symbolic link trick again: set up a link from the root folder to the public folder.  Steps:

Get an elevated cmd prompt (see details above)

cd \   (back to root folder)

mklink /d P  C:\Users\Public\   (create the symbolic link)
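To sanity-check the new link before touching Picasa, a quick dir from the same prompt should simply list the contents of your Public folder – Pictures, PicasaLib and the rest:

dir C:\P

dir C:\P\Pictures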

You now have what the system thinks is a P folder, and can use it in the Picasa “watched folders” definition.  Which means you need to start Picasa, navigate to Tools > Folder Manager, and select \P\Pictures as the folder to watch – unselect everything else.  For a few minutes you’ll see heavy activity, and for a while Picasa may show duplicate folders and double the number of pictures you really have, but it will eventually sort it all out. 

Setting up the “clients” or “satellite” computers

These are the computers that have no local Picasa information; we just use them to access our main libraries from the “server”.  Ideally we would just map the public folder on the server as network drive P:, but we’ll run into a syntax problem.  On the main computer Picasa prefixes our watched-folder setting, changing \P\Pictures to C:\P\Pictures, which works just fine there but not on the satellite ones.   Picasa’s watchedfolders.txt does not accept a \\Computername\Foldername designation; it has to look local.

So we go back to our friend… yes, you guessed it right, symbolic links. But now even this old friend lets us down: we cannot define a symbolic link to a network folder, only a local one. Oh, well, we’ll outsmart the system again, by combining network mapping with symbolic linking: we’ll map the network drive to an interim name first, then link to this interim drive-name.  Steps:

From windows, map the \\MainComputername\Users\Public\ folder as drive O:

Make sure there is no Picasa2 or Picasa2Albums folder in \Users\username\AppData\Local\Google

Get an elevated cmd prompt

cd \  

mklink /d P  O:\

cd  \Users\username\AppData\Local\Google

mklink /d Picasa2  O:\PicasaLib\Picasa2

mklink /d Picasa2Albums O:\PicasaLib\Picasa2Albums

Repeat the last three steps for all user accounts that should access Picasa from this computer.  Logically you would now have to open Picasa and change the watched folders to P\Pictures, but there’s no need: it’s already set up on the server machine, and you’ve just told Picasa to pick up all parameters and data from there.
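For reference, here is the whole satellite-side setup collected into one command-line sketch, run from an elevated prompt on the client machine. MainComputer, the interim O: letter and username are placeholders – substitute your own – and the net use line assumes the server’s Public folder is reachable under that exact UNC path (if you already mapped O: from Windows, skip it). The first three commands are once per machine; the last three are the ones you repeat for each user account:

rem Map the server's public folder to an interim drive letter (adjust the UNC path to your network)
net use O: \\MainComputer\Users\Public /persistent:yes
rem Create C:\P pointing at the interim drive, so the shared watched-folder path looks local
cd \
mklink /d P O:\
rem Point this user account's Picasa database folders at the shared ones on the server
cd \Users\username\AppData\Local\Google
mklink /d Picasa2 O:\PicasaLib\Picasa2
mklink /d Picasa2Albums O:\PicasaLib\Picasa2Albums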

In other words, you’re all set.  Needless to say, this only works as long as your “server” is turned on:-) but then all computers on your network will see the very same photos, and all editing, manipulation, face or geo-tags, albums… are immediately updated in the central database and reflected on all computers, no matter where you originate them.

Warning:  as stated before, I have not tested what happens when multiple users access / attempt to update Picasa data at the same time, but I assume it’s not a very good idea.  We’re ‘cheating’ here – Picasa was not designed to work in a multi-user environment – so let’s play it safe: only one computer and one user should access it at any one time.

Conclusion

We’ve just turned a hopelessly single-user, single-computer product into a networked one. Sort of. :-)  It’s obviously just a workaround, and…well, read my disclaimers again.

Now, let’s remember, Google isn’t really a PC or local network software company. They are The Web Pioneers. I think the long-term solution will be much stronger integration with Picasa Web Albums.  Currently you can mark your folders / albums to synchronize with the Web version, but it’s one-way, from PC to Web only.  That’s not what I call full synchronization.  If you change anything in Web Albums, it’s not reflected back in your local Picasa library.  I believe the future is full two-way – actually multi-directional – synchronization, where Web Albums become the conduit between any number of client computers that access Picasa.  It’s not impossible – services like Syncplicity already do similar synchronization; Google just has to embed it in Picasa’s Web capabilities.  It’s time for Google to create the seamless online / offline photo management environment.

(This how-to guide was originally posted as Part 1: the Problem and Part 2: the Solution @ CloudAve)

post

Time for Device Independent Data Plans

The Apple iPad event is still on, and the Internet is crumbling… Twitter barely crawls, CoveritLive isn’t exactly live, the major sites providing blog coverage are barely accessible… this is iKill – the day Apple Killed The Net. 🙁

But I want to talk about something more important:

[Screenshot: iPad data plan pricing]

It’s a screenshot from Engadget’s coverage.  Yes, reasonable data plan prices. Except… how many of them do you need?  An iPhone data plan, too?   A data plan for your USB stick for the times you do need a “regular” notebook to work on?

Remember this?

[Image: rotary phone]

Yes, phones looked like that.  And there was a time when phone companies (Ma Bell) charged extra when you had more than one outlet in your home….

Remember the early days of cable TV?   You had to (well, were supposed to) pay extra for each additional cable outlet.

How about the early days of the Internet, before wireless became pervasive?  Yes, ISPs expected us to pay extra for each outlet.

These anachronistic charges are all gone – we pay for the service, no matter what device we use to access it.

So why would wireless access be any different?  We will soon have an increasing number of devices, but the underlying service is the same.  In fact, chances are when I use my iPad (which I don’t have), I will not be using my netbook / notebook, or browsing the Net on an iPhone or Google Nexus One … as a consumer I may own a variety of devices, but chances are I will only use them one at a time.

It’s time wireless providers woke up to the 21st century and charged for consumption on a per-account (per-person) basis, not per device.

(Cross-posted @ CloudAve )

post

Google’s FeedBurner Social Isn’t Quite Ready. Back to TwitterFeed – for Now.

Why bother with an intermediary when we can now have FeedBurner send our blog post to Twitter directly?

– I wrote in Startup Bloodbath in Social Media and I meant it.  But for now, we’re switching back to TwitterFeed.

The new Feedburner service that pushes blog posts to Twitter directly isn’t quite ready. Let’s just say it’s a bit too trigger-happy: it pushes an update after every “save”, even minor updates to already published posts.

This is crappy enough that we’re switching CloudAve back to TwitterFeed – for now.  But it’s also crappy enough that Google will likely fix it soon – and then we’re back to the original formula: no need for intermediaries.

(Cross-posted @ CloudAve )

post

Startup Bloodbath in Social Media?

(Image credit: Evil Fish)

Google announced their own URL shortener. Great.  But some startups may be panicking.  The TechCrunch title says it all: Bit.ly Just Got Fu.kd: Facebook And Google Get Into The Short URL Game.

Of course bit.ly is not the only possible casualty, but they are the dominant one in the URL shortening space – or at least they have been so far…

But what most commentators haven’t noticed is another feature from Google: FeedBurner Social, which might very well kill TwitterFeed.  Yes, why bother with an intermediary when we can now have FeedBurner send our blog post to Twitter directly?  Check out the URL for this very post on Twitter: it’s the shiny new goo.gl variety.

And it’s not over yet… just as we’re absorbing what all this means, here’s news of Twitter testing business features, including the ability for multiple users to post on behalf of one organization…  Somehow I don’t think CoTweet, HootSuite and a bunch of others are too happy about it.

Are they all doomed?  Not necessarily – right now they all offer additional features (multiple accounts, scheduling, stats..etc), but nevertheless, it must not be very comforting when the Ultimate Giant enters their space…

Oh, yeah, I know … we’ll soon see the statements from all these startups welcoming Google, validating their markets…etc. 🙂

(Cross-posted @ CloudAve )

post

Don Dodge Dumps Microsoft After it Dumps Him

It’s been less than two weeks since Microsoft let Don Dodge go, along with 5,000 other employees.  He parted gracefully, then soon posted:

Getting dumped by Microsoft was a life changing event…for the better. The future is very bright. The opportunities are amazing.

Don probably set a World Record in the speed of getting a new job offer:

Vic Gundotra at Google was the first one to contact me with an opportunity…90 minutes after the news of the layoff hit. That fast decisive action was refreshing, and such a contrast to the slow, secretive, bureaucracy at Microsoft

Not only was the outreach quick, but the entire hiring process concluded in days, which is highly unusual for Google. Yes, Don Dodge is now with Google, and it did not take long for him to dump the remainder of his Microsoft life:

  • Thanks Microsoft Outlook, but I’m going to Gmail.
  • Thanks Microsoft Office 2007, but I’m going to Google Docs.
  • Thanks Microsoft Windows Mobile 6.5, but I’m going to Google Android.
  • Thanks Microsoft Internet Explorer, but I’m moving to Google Chrome.

Yes, I’m sure Microsoft made the right move, getting rid of a well known public face of the company was all worth it, and now this very public slap in the face is just the icing on the cake.  Well, Google was smart enough to turn Microsoft’s loss into their own gain 🙂

Congrats to Don for landing on his feet extra-fast, and – to paraphrase his blog title – moving on to The REAL Next Big Thing.

(Cross-posted @ CloudAve )

post

Prezi Dazzles: Live Recording of a Social Media Class

I’ve said it before: if you want to dazzle with your presentation, use Prezi.  The Prezi team did to presentations what Google did to email: throw away all pre-existing notions, re-think why and how we use email (or presentations) and build something from scratch.  That’s how you get results that truly dazzle.

Of course that brings up the question of just how much you want to dazzle: probably not too much in the corporate world: as Prezi throws away all notions of what presentations are (used to be), there would be too much “undoing”, too steep a learning curve.  PowerPoint and Enterprise are too deeply intertwined.  That said, Prezi is a great tool (online and offline) for superstar freelancers, small groups, or just about anyone who gets on stage and wants to … yes, dazzle.

But Prezi can make you dizzy 🙂 at least in the video below, played at 10 times the original speed.  So hold on to your chair tight, and enjoy…

(Cross-posted @ CloudAve )

post

SaaS CEO on Improving Website Visitor to Trial User to Paying Customer Conversion

I don’t claim to be an expert in the area, so this is more a quick pointer than a real post. Well, too short for a post, too long for a tweet :-)

Duane Jackson, CEO of SaaS accounting provider Kashflow writes up his experience of using Google Analytics and Website Optimizer to fine-tune his site to increase conversion:

It turned out that of everyone that visited our registration page, only 45% of them actually went on to complete it. So over half of everyone that looked at our registration page sailed off into the sunset never to be seen again.
We’ve managed to gradually improve that to almost 70% by trying a few different things…

His conclusion:

I’m really pleased we’ve found the time and tools to do this. What really irks me is that we didn’t do this ages ago. I could sit down and calculate what our revenues and customer numbers would look like if we improved conversions like this years ago – but I’m scared to.

Every day that you’re not actively working on improving your conversion ratios is a day of lost opportunities.
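To put purely hypothetical numbers on it: if 1,000 visitors a month reach your registration page, lifting completion from 45% to 70% means roughly 250 extra trial signups every month – before you spend anything on driving new traffic.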

You can do it, too, at zero cost :-) Or if you want to go pro, you may want to check out HubSpot, the inbound marketing gurus.

(Cross-posted @ CloudAve )

post

USGS Now Embraces Twitter as Source of Earthquake Information

This time it was personal.  The earthquake hit three miles from my house.  It was a minor one, magnitude 3.7, but I felt it very strongly, albeit very briefly, too. Just a sudden kick in the butt, nothing more.  Perhaps that’s the difference between being right above the epicenter and feeling it remotely.

I jumped on Twitter, and I was among the first few to report the quake.  Within seconds there were dozens, then hundreds of reports.

Not that it was a surprise: we’ve seen Twitter become the primary initial news source, be it for earthquakes, fires, military coups…etc.  (For a while Google thought I was some earthquake expert simply because I pointed out Twitter was the first to report quakes in Japan and China.)  But clearly, not all information on Twitter is reliable, as was the case of the fake LA earthquake video.

We need both speed and reliability.  The first comes from the crowd – nothing can beat having millions of “reporters” in the field, wherever and whenever significant events happen.  But we typically do expect some form of verification, be it a traditional news agency, or in the case of earthquakes often USGS, the US Geological Survey.  Until recently the information flow was one-way.  But after yesterday’s quake I found an interesting link to the Google Maps mashup above. It’s created by @usgsted, the USGS Twitter Earthquake Detector. Here’s the explanation:

In this exploratory effort, the USGS is developing a system that gathers real-time, earthquake-related messages from the social networking site Twitter and applies place, time, and quantity data to provide geo-located earthquake detection within 60 seconds of an event’s origin time. This approach also provides a central directory of short first-impression narratives and, potentially, photos from people at the hazard’s location.

Social Internet technologies are providing the general public with anecdotal earthquake hazard information before scientific information has been published from authoritative sources.  People local to an event are able to publish information via these technologies within seconds of their occurrence. In contrast, depending on the location of the earthquake, scientific alerts can take between 2 to 20 minutes. By adopting and embracing these new technologies, the USGS potentially can augment its earthquake response products and the delivery of hazard information.

To be fair, the USGS has not been entirely deaf even before this: once you locate the relevant quake info (which is quite an achievement in itself) there is a Did You Feel It? link where, if you are really persistent, you can provide feedback.  The form asks for a lot of data and takes a while to finish – enough to deter most.  Which is why the fact that the USGS is now embracing Twitter is a major milestone: it combines the speed of crowdsourced reporting with the verification / authority of experts.

(Cross-posted @ CloudAve )