Monthly Archives: March 2009

– Twitter, Facebook & Gov’t 2.0

Twitter Social Networking


“Web 2.0” is taking the Internet by storm. Use of Facebook (and similar sites) has exploded and may even have become passé for some people. Even that notorious bastion of anti-change troglodytes, the U.S. Congress, apparently loves Twitter.

But, amazingly enough, social networking tools may not be of much use to local government, unless there are significant improvements or new applications.

The subject of this blog post is basically: how do social media companies and local governments need to change to really bring social networking “to the people”?

Why do local governments (cities and counties) even exist? The answer to this question is easy: these are the governments most visible and directly involved in the daily lives of most people (although you certainly wouldn’t know it from newspaper headlines, the evening TV news or the blogosphere, where the fedgov gets far more square inches of newsprint and computer monitor space).

Local governments take care of streets and parks, provide water and dispose of solid waste and wastewater. When you call 911 your local police or fire department responds, not the FBI or the Army. Local governments are very much connected to neighborhoods and individual communities. Almost everyone can walk into their county courthouse or city hall and ask for help or complain about a service. People can actually attend City Council meetings and make comments, or even – in most cases – talk directly to the officials they’ve elected to run their city or county government.

In contrast, finding the right state or federal government agency to address an issue – much less contacting them – is more difficult. Try walking into the U.S. Capitol to talk to your Congressperson!

But the bottom line is that local governments are very much in the “call and we’ll respond” mode.

It would seem that the “social media” – which are built for enhancing interaction and communications among individual people – are tailor-made to work for local government. These tools, however, need some significant enhancements to be really useful. Here are some specific suggestions.

Use of Facebook has really exploded, especially among folks my age, which I’ll just say is the “over 50” set. Local government should really want to use this sort of social networking tool. We set up blockwatches, so people can let each other know about suspicious activities and crime in their neighborhoods. In Seattle, we have “SNAP” teams (“Seattle Neighbors Actively Prepare”), which try to train blocks of residents to be self-reliant and help each other after a disaster such as an earthquake, when it may take days for help to arrive.

Facebook should be a natural application to allow neighbors to build stronger blockwatches or SNAP teams. But that’s not really the case. First, as an individual, I don’t necessarily want to share the same kind of information about myself with my neighbors as I do with my “friends” or relatives. That’s a serious deficiency of Facebook today, where my boss or co-workers, as well as my “friends” and “relatives” – and now “neighbors” – all might be Facebook “friends”. When I think about posting “25 things about myself”, I keep all those “relationships” in mind.

Next, there needs to be some relatively easy yet not overwhelming way for groups of neighbors on Facebook to communicate with their local government, and for their government to communicate back. We (the City) want to hear about suspicious activities and get tips about crime. But clearly no police department can investigate the hundreds or thousands of such reports which might flow in daily from the thousand blockwatches that could be established in the City of Seattle. A really useful Facebook-like application would have an easy way to correlate these reports and allow neighbors to verify issues and support each other – or at least sort the “wheat” (real problems) from the “chaff” of perceived problems.
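To make that concrete, here is a minimal sketch of the “wheat from chaff” idea – entirely hypothetical, with made-up report fields and a made-up corroboration threshold, not any existing Facebook feature – in which a neighborhood report is only escalated once several different households have corroborated it:

```python
from collections import defaultdict

# Hypothetical neighbor reports: (block_id, reporter, description)
reports = [
    ("block-12", "resident-a", "car prowler on 5th Ave NE"),
    ("block-12", "resident-b", "suspicious person trying car doors"),
    ("block-12", "resident-c", "car break-in overnight"),
    ("block-40", "resident-d", "possum in my yard"),
]

CORROBORATION_THRESHOLD = 3  # distinct households before police ever see it

# Count how many different households reported something on each block.
by_block = defaultdict(set)
for block, reporter, _ in reports:
    by_block[block].add(reporter)

for block, reporters in by_block.items():
    if len(reporters) >= CORROBORATION_THRESHOLD:
        print(f"{block}: {len(reporters)} households corroborate -- escalate to the police department")
    else:
        print(f"{block}: only {len(reporters)} report(s) -- leave it with the blockwatch for now")
```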

This is an issue on a daily basis but is ten times more important during an emergency situation or a disaster, when first responders are overwhelmed and reports of problems multiply.

On the other hand, a Facebook-like social networking tool might allow local government to quickly dispel rumors and calm out-of-control fears during those same situations. And, if structured correctly, the tool could allow the police to educate residents about keeping themselves safe. A Facebook-like application might also allow the Fire Department or Public Health to be aware of health problems in neighborhoods and, for example (with appropriate privacy controls), help neighbors check on and support the elderly or infirm in their neighborhood.

There are dozens of other uses I’m not mentioning – encouraging people to form and manage Parks Department sports teams, find out about recreation opportunities, or join their neighborhood council for a graffiti-reduction or neighborhood clean-up campaign. All these activities build community.

A second great service with similar applications is Twitter. Twitter’s great strength is its short, 140-character statements, and the fact that one can tweet from cell phones and iPhones as well as computers. The applications for local government are legion, ranging from reporting public safety hazards – streetlights out, traffic accidents, potholes – to gaining a rapid, accurate assessment of what is happening during a major incident such as a gas line explosion, earthquake, power outage, the rantings of a CTO, or a plane crash-landing in the Hudson River.

Similarly, the city or county might be able to “tweet” the status of streets or traffic or snow emergencies, thereby informing people of emergent situations. Government twitterers could also be definitive sources of information, helping to quell rumors. But I think that tweets from on-the-scene “civilians” can play a major role in rumor-quelling and information gathering in and of themselves.

The problem with Twitter is just that it is so overwhelming. San Francisco Mayor Gavin Newsom started to tweet a few weeks ago, and rapidly gained over 100,000 followers. Hey folks, there’s no way he can adequately respond to the @replies of 100,000 people!

We need some good way to link official Twitter streams and @replies to City government service request systems or 311 services, so duplicate reports are managed and government adequately acknowledges and responds to reports and requests. While we’re at it, Twitter could become GPS-enabled. Basically, that means your “tweet” about a pothole would automatically carry your present location along with it. In turn, if that pothole is scheduled for an asphalt bath, local government could immediately respond “that will be handled next Tuesday by noon”. FedEx delivery promises meet the local transportation department.
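Here is a sketch of how a geotagged tweet might be matched against open 311 service requests. Everything in it is an assumption of my own for illustration – the tweet fields, the request records and the 50-meter “duplicate” radius are hypothetical, not an existing Twitter or City of Seattle API:

```python
import math

# Hypothetical geotagged tweet: the 140-character report plus the sender's location.
tweet = {
    "user": "@seattleite",
    "text": "Huge pothole on Aurora Ave N near N 85th St",
    "lat": 47.690,
    "lon": -122.344,
}

# Hypothetical list of already-open 311 service requests.
open_requests = [
    {"id": 4711, "type": "pothole", "lat": 47.6901, "lon": -122.3442,
     "promise": "crew scheduled for Tuesday by noon"},
]

def distance_m(lat1, lon1, lat2, lon2):
    """Rough equirectangular distance in meters -- good enough at city scale."""
    dx = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    dy = math.radians(lat2 - lat1)
    return 6_371_000 * math.hypot(dx, dy)

def handle_tweet(tweet):
    # If an open request sits within ~50 meters, treat the tweet as a duplicate
    # and reply with the existing promise ("FedEx delivery promise" style).
    for req in open_requests:
        if distance_m(tweet["lat"], tweet["lon"], req["lat"], req["lon"]) < 50:
            return f"{tweet['user']} Thanks -- already reported (#{req['id']}): {req['promise']}"
    # Otherwise open a new service request and acknowledge it.
    new_id = max(r["id"] for r in open_requests) + 1
    open_requests.append({"id": new_id, "type": "pothole",
                          "lat": tweet["lat"], "lon": tweet["lon"],
                          "promise": "triage within 2 business days"})
    return f"{tweet['user']} Got it -- service request #{new_id} opened."

print(handle_tweet(tweet))
```

The point of the sketch is simply that duplicate management and the acknowledgment back to the tweeter can be automated once location rides along with the report.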

In the same vein as Twitter and Facebook, I should also mention the YouTube and Flickr services, which could allow people to post video or photos of crimes and public safety problems (or, god-forbid, the beauty and “what’s right”) of their neighborhoods as feedback to their governments.

Finally, I need to mention social networking and improving constituent input to the policy and legislative process. As I said above, one advantage of cities and counties is that people can walk right in and talk to elected officials or speak at Council meetings. But rarely do most people actually talk to their local council members, unless there is an issue of overwhelming concern. Usually special interest groups and gadflies provide feedback, while the interests and opinions of the vast majority of constituents are unknown. Every City has a “gang of 50” (or 10 or 100) who loudly give their opinions on almost any topic, while the ideas of the “silent majority of 500,000” (in Seattle’s case) are largely unknown.

Facebook, Twitter, LimeSurvey, Google Moderator and similar tools might provide a way to receive and better rank such input. Google Moderator was used by the Obama administration to allow people to post ideas, and then vote on them. Because a userid/password was required, a single individual could not overwhelm the voting process. Tools like Delicious can also be used for ranking. Visualization tools like Microsoft Virtual Earth or Flickr could be used in mashups to build visuals and gain comments on neighborhood plans, capital projects or parks improvements.
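As a rough illustration of the “one userid, one vote” idea – this is my own sketch, not how Google Moderator actually works – the requirement of a login is what keeps a single individual from overwhelming the ranking:

```python
from collections import defaultdict

# Hypothetical (idea_id, user_id, vote) records; vote is +1 or -1.
raw_votes = [
    ("fix-potholes", "alice", +1),
    ("fix-potholes", "bob", +1),
    ("fix-potholes", "alice", +1),   # duplicate from the same userid -- ignored
    ("dog-park", "carol", +1),
    ("dog-park", "bob", -1),
]

seen = set()                 # (idea, user) pairs already counted
scores = defaultdict(int)
for idea, user, vote in raw_votes:
    if (idea, user) in seen:
        continue             # one vote per userid per idea
    seen.add((idea, user))
    scores[idea] += vote

# Rank the ideas for the elected officials to review.
for idea, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{idea}: {score:+d}")
```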

All these tools are in their infancy. They are not statistically valid measures, or even voter-valid measures (voter-valid means “elections”) for use by officials in formulating policy. These tools can produce a tremendous amount of data and opinions, but sifting that data and analyzing it into useful information is far beyond the current state of these tools. And the sheer amount of feedback and requests which people can generate to their government will rapidly overwhelm our ability to respond or even acknowledge it.

As almost an afterthought, I should mention the crying need for a working version of audio and video search – key tools required to sift through all this data and turn it into more useful information for government action.

As a final note, these tools could deepen the “digital divide” – the chasm between those people who have access to computers and Facebook and Twitter, and those who do not (although – as a bright spot in this – almost everyone has a cell phone, and you can tweet from a cell phone).

I’m convinced these new social media tools will make stronger neighborhoods and communities. They will improve the social fabric and cohesiveness of our society. But these tools need a lot of improvements and enhancements.

I hereby challenge the Facebooks and Googles and Twitters of the world to make those improvements happen.

“Yes you can.”


Filed under google, web 2.0

– The “P-I Test”


The Seattle Post-Intelligencer, 1863-2009

Technology projects scare the hell out of me. And I’m a Chief “Technology” Officer! Tech projects are full of risk – there are two dozen career-ending ways projects can go south.

But how do you measure and control risk?

Bill Schrier’s first project management rule is the “P-I Test”.

Now, what is “P-I”? No – it is not “private investigator”. I named the “P-I test” after Seattle’s beloved daily newspaper, the Seattle Post-Intelligencer. Although I’ve been using the “P-I test” for years, the real Seattle Post-Intelligencer publishes its last paper edition today, March 17, 2009, after 146 years of publishing.

The P-I test is simple: if a particular technology project goes south, where will it show up in the Seattle P-I – front page above the fold? Local section, page 5, beneath a mattress advertisement? Or will the failure (hopefully) be off every reporter’s radar, that is, no one cares?

One of the big differences between government technology work and private industry is the P-I test. Only very rarely will a failed technology project from a private company make the newspapers. Private companies care about stock price, shareholder value and “face”. So their project failures are buried deeper and darker than the bottom of a coal mine.

But in government, everything is (at least eventually and theoretically) subject to public disclosure – and failed projects make news. Successful projects (or any good news, for that matter) rarely sell newspapers.

And, in an environment where taxpayer money is at risk and voters give a verdict on government leadership every four years (i.e. by electing a Mayor or City Council members), mitigating the risk of failing the P-I test is “job one”.

But when projects are successful, no one notices! Indeed, that is possibly the truest measure of a successful tech project – implementation without notoriety.

Predicting which failures newspapers or reporters or the public will seize upon is notoriously hard. I’ve had relatively trivial projects make headlines, like a simple mistake of sending an e-mail message with all recipients’ e-mail addresses in the clear, or having a bit of difficulty getting a Wi-Fi hotspot to work right.

On the other hand, implementation of a new billing system in 2002 for the City of Seattle’s electric utility – a project a year late and $10 million over its budget of $28 million – probably played a part in the end of the electric utility superintendent’s career.

I have a whole set of “Schrier’s project management rules” including (2) “hire somebody who knows what they are doing” and (3) “make sure the butt of someone in the business is on the line”. And I’ll write about those rules in a future blog entry.

But the real bottom line is that – since I took over as CTO in 2003 – the City of Seattle has not had a single significant information technology project failure. And we’ve done more than $100 million in projects! Credit for this string of successes belongs, not to me, but rather to Mayor Greg Nickels, who demands accountability from every department director, and to the Project Management Center of Excellence in my department, a dedicated set of four professionals who track and demand accountability on over 30 projects-in-progress.

I’ll write more about the other “rules” in Schrier’s project management lexicon. In the meantime, I’ll be extraordinarily sad about the last paper edition of the Seattle Post-Intelligencer, publishing today, and the loss of the namesake of my first and most important project management tool, the “P-I Test”.


Filed under project management

– U.S.: Third World Broadband


Fiber Broadband

The new fedgov stimulus bill was signed into law, and it contains $7.2 billion to expand broadband in the United States.  Hooray!  The problem of Internet access in the United States is solved, right?

Hah!  Not by a long-shot.

The U.S. is 15th in the world in broadband penetration.  And our primary technologies for broadband are still cable modems and the phone companies’ Digital Subscriber Line (DSL).  Cable modems give relatively high speed – 6 to 30 megabits per second – but that speed is shared among dozens or hundreds of households, and upload speeds are typically much slower than download speeds.
DSL gives a dedicated connection to each user, but typically at relatively low speeds such as 1, 2 or 7 megabits per second and, again, much slower on the upload than the download.

Now, you might think “gee, a million bits a second is really fast”.  Yes, yes it is, if you are reading static websites or doing e-mail.  But the future of the “net” is video – and not the grainy, jerky (no pun intended) YouTube variety, but HDTV.  And HDTV requires 6 megabits per second each way.  Read on …

Most developed nations deploying “broadband” are NOT doing cable modems or coax or DSL or copper.  They are deploying fiber optic cable to each household and business.  Seoul and Tokyo have deployed.  Amsterdam and Paris and Venice and Singapore are deploying.

A few forward-thinking cities in the United States are – on their own – also deploying fiber to each premise.  Lafayette, Louisiana and Clarksville, Chattanooga, Pulaski and Jackson, Tennessee are examples.  (See a great map of fiber deployments here.)

The beauty of fiber broadband is really high speed – 100 megabits per second or more – and true, two-way, symmetric networking.  These are networks capable of downloading whole movies in HDTV in a few minutes.  Or networks which can stream two-way HDTV so that every home or business can be an HDTV studio or a video conference/telework center, or give people a phenomenal new Internet gaming experience.
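A quick back-of-the-envelope calculation shows why (the roughly 5 gigabyte size for a feature-length HD movie is my own assumption):

```python
# How long does a ~5 gigabyte HD movie take to download at various speeds?
movie_bits = 5 * 8 * 10**9          # 5 gigabytes expressed in bits

for label, mbps in [("1 Mbps DSL", 1), ("7 Mbps DSL", 7),
                    ("30 Mbps cable (unshared)", 30), ("100 Mbps fiber", 100)]:
    seconds = movie_bits / (mbps * 10**6)
    print(f"{label:>26}: {seconds / 60:7.1f} minutes")
```

At fiber speeds that download takes minutes; over typical DSL it takes hours.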

Think about working at home, and joining meetings via HDTV video conference with quality so great you can actually watch your co-workers sweating.  With HDTV quality you can actually participate!  Or how about having your high school kid join a virtual HDTV classroom for that college-credit advanced placement class.  Or having your grandparents join you and their grandkids for dinner – several nights a week – using HDTV.  Think of the difference in their lives (maybe NOT yours!).
These same networks can be used to manage the energy use and carbon footprint of homes and businesses and buildings.  These are networks capable of telehealth and telemedicine – visiting your nurse or doctor from home and they can SEE you in HDTV.

And what will the fedgov broadband stimulus deliver?  Well, there is $2.5 billion for broadband to “rural areas” via the Department of Agriculture’s Rural Utilities Service.

In terms of urban areas, a lot of the requirements are still to be determined before $4.7 billion in stimulus grants are awarded.  The funds need to be spent in unserved or underserved areas.  But what does that mean?  Compared to the fiber deployments being undertaken elsewhere in the world, most places in the United States – other than those served by Verizon FiOS – are “underserved” because we only have DSL and cable.  How fast is this proposed stimulus-funded broadband?  Is it 256 kilobits per second, or a megabit, or 100 megabits?  Is it symmetric, or is a very slow upload speed acceptable?

The fedgov NTIA (National Telecommunications and Information Administration) has published in the Federal Register an extensive list of such questions for us all to answer, to help design their program.

I certainly hope this great new stimulus package will not just try to extend DSL or cable Internet and call that “broadband”.  I hope the NTIA and Agriculture stay true to the Obama administration’s goals of being bold, inventive and innovative – and that, with this broadband stimulus, they don’t try to make the United States a “better” third-world nation in terms of broadband, but rather sponsor projects which show the way to a truly high-speed, two-way-HDTV-networked future.


Filed under broadband, Fedgov, fiber, video

– Microsoft vs Open Source

Microsoft Public Sector CIO Summit

This week is the Microsoft Public Sector CIO Summit in that village named Redmond, “across the pond” from Seattle. It’s also a week of continuing rotten economic news for public and private sector alike. In this environment, it sure is tempting to chuck Microsoft’s Office and web products and their complicated Enterprise and Select Agreements in favor of open source equivalents.

But you know what, the City of Seattle is not going to do that. Why?

Regular readers of this blog – if there are any – know I’m from Seattle and most of you know I’m a serious supporter of Microsoft software and products.

Clearly, I’m prejudiced.

Microsoft provides 40,000 jobs in my area, and we have hundreds of thousands of shareholders (many of whom are also constituents) living here. We benefit from the tremendous wealth which has flowed from around the world into Puget Sound – to literally thousands of people, institutions and non-profits in the region. That wealth flows elsewhere too, of course. The Bill and Melinda Gates Foundation is doing wonderful things for schools and libraries across the nation and around the world. Microsoft research and technology centers are at many locations outside of Puget Sound – indeed, there are about 50,000 Microsoft jobs OUTSIDE of Washington State.

On the other hand, all governments have budget pains. I got my first official budget-cut memo three weeks ago (we’ve been doing actual budget cutting for at least nine months). In the past I’ve had to lay off people based almost solely on seniority (or, rather, “juniority”). And I’ll undoubtedly be doing it again at some time in the future, if my job isn’t cut first!

Microsoft’s licensing costs are a large part of our budget, as are the maintenance and licensing costs we pay to Oracle, IBM, ESRI, and many other vendors. We do need to examine alternatives and options.

But I’m somewhat baffled that any CIO of a large government would seriously consider using open source software for our mission critical systems and services. This seems a little bit like using cell phones to dispatch police officers and firefighters or outsourcing your help desk to India. It will save money in the short term and work pretty well “most” of the time …

What is the advantage of using software from Microsoft – or Oracle, or ESRI, or PeopleSoft, or Hansen, or … any major software vendor?

No business large or small would seriously consider writing its own financial management system, even though, with web services, database software and a spreadsheet program we could probably do it. We could probably cobble together a computer-aided-dispatch system or work management system from similar components.

The advantage of off-the-shelf or “shrink wrap” software is that it is pre-written for us, the bugs are fixed, the upgrades are provided and – of increasing importance – security issues are handled and addressed.

Sure, you’ll say, Microsoft software is really prone to security flaws and attacks. Why is that? Because it is the most popular and ubiquitous software in the world! It’s logical that any software which reaches significant market share will become a target for teams of hackers employed by terrorist nation-states and crime syndicates. And open source code is on the web, freely available for such hackers to view!

Now, I understand that open source is supported by a developer community, and that’s good. But this developer community is nebulous – a difficult thing to pin down when something serious goes wrong. Governments now rely heavily upon technology to provide critical services and interact with constituents. CIOs are responsible to elected officials for keeping that technology reliable and available. To depend upon an amorphous “community” of developers with no direct stake in your mission is a risky proposition.

Few organizations – other than local governments – have technology systems so important that people’s lives are actually in jeopardy when those systems fail. Sorry, I don’t want a “nebulous” community supporting my public safety and utility systems.

Next, in an open source world, what do we do about application integration? Gee, almost every vendor writes their software to work with Microsoft Office, Exchange/Outlook and similar products. Even hardware vendors such as Nortel or Avaya or Motorola make sure their hardware and software integrate with Microsoft. If there is an issue with the way PeopleSoft HRIS or Government Financials works with e-mail or office software, they will always fix the Microsoft integration first. When a hot new product comes out – like BlackBerrys – the vendor will make sure it works with Microsoft software right out of the gate.

Believe me, I know this first-hand, since the City of Seattle was (and still is) a GroupWise e-mail user. I had department directors knocking down my door to get BlackBerrys, but GroupWise support for the BlackBerry was released FOUR YEARS behind the Exchange/Outlook version.

Furthermore, many of our applications now vitally depend upon web services for their user interface. Most of those applications vendors will not be officially supporting open source versions of web services anytime soon.

So, if we – government CIOs – move to using open source software, how do we handle the support and integration?

Answer: like everything else, we hire smart people. Highly proficient technical people who understand the bits and bytes of how this stuff works and can make it happen. Managers who can develop networks of people in other jurisdictions and in the open source community to fix the bugs, get the new releases and work on the integration. Skilled “open source” employees who are dedicated to our mission of making technology work for our government and the people we serve.

Well, where is our budget pressure? Yes, it is in revenues and budget dollars. But it is also in FTE – headcount. How many times have each of us been told to reduce headcount? What is the one number (again, besides raw dollars) which newspapers, the public and elected officials always watch and measure? It is “Number of Government Employees”. There is constant pressure – even in good times – to hold the line on headcount, if not actually reduce it.

And when we do reduce headcount, what positions are cut and who is laid off? It is always the last hired, who are usually the youngest, most tech-savvy (at least on new software or open source software), most connected employees.

With open source, not only will we have to increase headcount, we’ll become vitally dependent upon those new hires and that additional headcount to make our most critical and important applications work.

By making us MORE reliant on headcount and FTE, I think a move to open source software actually exacerbates our budget problems.

On the other hand, elected officials and those with budget oversight are much more likely to accept payments to our software and hardware maintenance vendors as necessary requirements. They all have personal experience with technology, if only their cell phone and desktop computers. They all understand the need to maintain cars and buildings and computers.

But how much of our core and critical work can really be “crowd sourced”? Do we really want to open-source computer-aided dispatch systems or records management systems which have personally identifiable data or arrest/911 call information? And I’m very nervous about open sourcing any part of SCADA (utility control), or traffic management or other control systems which are vital to our governments and targets for attack and compromise.

In these high-pressure, budget-constrained, headcount-hunting times, use of open source software appears to be a high-risk, low-return proposition at best, and a “government fails” newspaper headline at worst.


Filed under budget, economy, Microsoft, open source