Monday, December 28, 2009

IT in Competition

Internal IT departments faced almost no competition in the mainframe era and most of the client/server era. This inevitably led to problems when competition was finally introduced, and to at least some backlash from their previously captive audience. This is not unique to IT people; just look at other examples such as the 1983 breakup of the AT&T monopoly or the introduction of significant foreign competition in the steel and automobile industries. Initial efforts attempted to put the world back to the "good old days", then moved on to casting as much fear, uncertainty and doubt on the newcomers as possible, and finally to reducing prices to hold back the flood waters. None of these addresses the fundamental lack of skills and perspective required to survive.

Internal IT faces an expanding competitive threat from an increasing number of sources. These include:

  • Outsourcing - In other words, buying IT from another company. The economics of this are the most puzzling to grasp. Can I really buy the same thing I've been doing myself for less, and the outsourcer can still make a 15%-20% profit? Some of this can be accomplished through larger scale, but outsourcers also live with competition every day, and it simply makes them better at knowing their business, their costs and your contract.
  • Off-shoring - The economics are simple. We make a lot of money in the U.S. compared to most of the world, there are talented people out there, being 12,000 miles away just doesn't interfere all that much, and in certain circumstances it even offers a speed-to-market advantage.
  • Software-as-a-Service - I want it and I want it now. Sign on the line or input your credit card and you are off to the races. No worries about upgrades, fighting for capital or waiting in the IT queue-of-death. Compelling marketing messages targeted at the business user with the checkbook and the need to solve their business problem. Match made in heaven.
  • Cloud Computing - A new source of competition that's currently in the fear, uncertainty and doubt phase. But the economic case for non-production servers (about 60% of the total), short-term needs and spiking workloads is very clear and compelling. Perhaps the scariest part is that all these virtual servers will look pretty much the same and can take advantage of continuous hardware price reductions. That Amazon small instance at 8.5 cents per hour today is likely to cost about 1 cent per hour in 5 years. Simple Moore's Law.
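The 5-year price projection above can be checked with the classic halving rule. This is only a sketch: the 18-month halving period is the usual Moore's Law rule of thumb, not an Amazon commitment.

```python
# A quick check of the pricing claim above, assuming compute cost
# halves every 18 months (the usual Moore's Law rule of thumb).
def projected_price(cents_per_hour, years, halving_months=18):
    halvings = years * 12 / halving_months
    return cents_per_hour * 0.5 ** halvings

print(round(projected_price(8.5, 5), 2))  # ~0.84 cents/hour, i.e. "about 1 cent"
```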

Can your IT department survive, and even thrive, in this competitive landscape? Yes, but not without a significant improvement in your business skills. All else being the same, you have several advantages, including:

  • You can see your budget and all the line item detail
  • You don't have a profit margin to obtain and retain
  • You should have better insight into your company's priorities
  • You are in a position to take more risk than an outsider
  • Your company knows you and you're readily available

To leverage these advantages you need to get involved: learn more about your business, stop protecting marginal jobs and embrace the changing technology options. Develop a trusted relationship with business decision-makers and deliver on your promises. Be an advocate for your company's change efforts. In the end it's all about business.






Monday, November 23, 2009

Password Craziness

There is a light at the end of the password tunnel. The only question is when will the endless craziness of longer and more complex passwords finally be tamed, for surely, either by reason or futility, it will end.


Surely you've seen the current craze: eight character passwords containing a combination of lowercase, uppercase, numbers and special characters. Let's say for the sake of argument that this is truly needed and worth every bit of aggravation. How long will it last? The basic math says about 10 years, given that Moore's Law holds and computing gets half as expensive every eighteen months, and that there are about 80 possible characters to choose from when building a password. To keep the same relative immunity, in 10 years it will take a 9 character password, in 20 years a 10 character password, and so on, until users revolt, or hopefully, start to question why in this world of marvelous technological innovation they must increasingly carry the security burden.
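The "one extra character per decade" arithmetic works out like this. The 80-symbol alphabet and 18-month doubling period are the assumptions stated above, nothing more.

```python
import math

ALPHABET = 80         # approximate symbols available per password character
DOUBLING_MONTHS = 18  # Moore's Law doubling period

# Adding one character multiplies the search space by 80, i.e. about
# 6.3 doublings of cracking work; at one doubling every 18 months, the
# attacker's hardware absorbs that in roughly 9.5 years.
bits_per_char = math.log2(ALPHABET)
years_per_char = bits_per_char * DOUBLING_MONTHS / 12
print(round(years_per_char, 1))  # ~9.5, i.e. "about 10 years"
```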


But why wait until the fires are burning around your feet and the smoke is rising? Take a fresh look at the problem and solve it sooner rather than later. A few things to consider:


  • It appears that the single biggest issue is using hashes to store passwords. Then if the bad guys get the hashes, it's straightforward to crack common passwords. If this is indeed a real problem, then fix it. Use something else. Like encryption. Duh.
  • Passwords can be cracked by brute force by simply trying every possible combination. This assumes that no prevention mechanism is in place to stop this tack from being successful. Since most passwords are validated by servers, this limits the number of attempts per second to the speed of the server and the intervening network, in most cases limiting the attempts to hundreds or maybe a thousand attempts per second. Compared to the over 1 quadrillion possibilities of an 80-choice, 8-character-long password, the math says it takes over 3 million days to try them all. I'll be lucky to live 30,000 days. I'll take my chances.
  • Passwords that are easy to remember are easy to guess, probably taking only a few thousand attempts via a "smart force" method. Very true, and assuming the server above just blindly tries as fast as its little GHz will allow, a very real threat. But humans can't try more than once every few seconds, and will undoubtedly give up after a dozen or so attempts and go find that sticky note they knew they would need someday. So why can't the server just let the user try a few times and then revoke the account? That actually works well, unless someone, inside or outside your company's "four walls", decides to enter your userid and a few bad passwords and lock you out of your computer. It happens, trust me on this one. The better solution is for the server to simply "slow down", ever more slowly processing new attempts. The hacker can't try any more than the user will try before giving up. This is the clever method used by Lotus Notes for years. If you're lucky enough to have Lotus Notes available, try a few bad passwords and see what happens. I promise it won't hurt.
  • If the hackers know all the common passwords, why do systems allow any of them to be used? Ah, the simple questions are the best, aren't they? If "egbdflth" is not in the list, wouldn't it be just as good as "Eg^-3U8i"?
  • Passwords that protect files are prone to hacking, since many copies of the file can be made and large numbers of computers can simultaneously try to guess the password. Sooner or later they will get in, and that time can be greatly lengthened by stronger passwords. Ah, we've uncovered a truly good use for strong passwords. Finally. Anyone out there do this on a regular basis?
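The brute-force arithmetic in the bullets above can be sketched out, along with the "slow down" idea. The 5,000 attempts per second and the doubling-delay policy are illustrative assumptions, not a description of any real server.

```python
# Brute-force numbers from the bullets above. The attempts-per-second
# rate is an assumed server-limited figure.
ALPHABET, LENGTH = 80, 8
combos = ALPHABET ** LENGTH                 # ~1.68 quadrillion possibilities
days_to_exhaust = combos / 5_000 / 86_400
print(f"{combos:.3g} combinations, ~{days_to_exhaust:,.0f} days to try them all")

# One way a server might "ever more slowly" process bad attempts:
# double the delay after each failure, capped at a minute. The attacker
# gets no more tries than a patient human would.
def delay_seconds(bad_attempts):
    return min(2 ** bad_attempts, 60)
```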

Why force unreasonable passwords on people? In my book if it's not in the hacker's list, it's good enough. I believe that's the reason behind the lowercase, uppercase, number and special character craziness. They're just trying to get you to pick a password that's not very likely to be in the hacker's list. But that makes the password much harder to remember than it really needs to be. That in turn leads to extremely weak password reset tools which "challenge" you with questions like your mother's maiden name. Wow, that's really secure. Not.

The larger strategic challenge is designing solutions that are simple at the edge and not just passwords. Simplicity is the best security and removing the human element is essential to that design. Stop requiring users to solve our security problems and start looking for solutions that address the real problems.

Monday, November 16, 2009

My Home PCs – Part 4 – Toys for Geeks

The final installment of this four-part blog contains some utilities that most home users will never need, but I find them indispensable. With the exception of WinDirStat, these toys take a reasonable amount of technical knowledge to use, although they are unlikely to cause your PC any problems if you want to give them a whirl. If nothing else, it's interesting to run Wireshark and Process Monitor to see the sheer volume of what's going on inside your PC. It's a much busier beast than you probably think.


  • Wireshark - This program captures all network data packets coming into and going out of your PC, very similar to the professional Sniffer tool. Although a network background is useful for understanding all the packet headers, it's even more useful to understand how an application works to make the best use of the data captured. It's a good idea to shut down as many applications as possible before running Wireshark to reduce the data being captured. You can download Wireshark at http://www.wireshark.org and there are some very good introductory videos and other documentation at http://www.wireshark.org/docs. You'll also be installing WinPcap, included in the Wireshark download, which is the component that interfaces between Windows and Wireshark.
  • Process Monitor - This is one of many sysinternals utilities that Microsoft provides and the one I find the most useful. It shows real-time file system, registry and process activity, in short, all the stuff that's happening inside your PC at a very detailed level. The tool provides filters to reduce the flood of data it produces to a more manageable level. The download is available at http://technet.microsoft.com/en-us/sysinternals/bb842062.aspx, which includes both individual links to the different tools and a single download if you want the entire suite.
  • VirtualBox - For those of us that like to try out new operating systems such as Ubuntu Linux and Google Android and want to make it painless, VirtualBox is the answer, and can be found at http://www.virtualbox.org/wiki/Downloads. This Sun product comes in two versions: VirtualBox OSE (Open Source Edition) is free for all purposes, while VirtualBox is free only for personal use and product evaluation. More details can be found at http://www.virtualbox.org/wiki/Editions. VirtualBox creates a virtual environment for its guest operating systems and boots image files in the .iso format. It also handles virtual machines packaged in the Open Virtualization Format (.ovf).
  • Google Calendar Sync - In today's world of technology we have a lot of duplicate tools, one for our work life and one for our home life. But having separate tools sometimes causes issues, and in my world having two calendars was particularly painful. Enter Google's free Calendar Sync tool, which can sync an Outlook calendar to a Google Calendar. I have my normal Google calendar that comes with my personal GMail account, which is my home life calendar. I have another Google Calendar, using a different account, which contains a synchronized copy of my work life calendar. I set up this second account to be viewable by my home life account, and I can view both my calendars at the same time, giving me a complete view of my life. My wife does the same, shares both of her calendars with me (and vice-versa), and I can see our combined four calendars all at the same time.
  • WinDirStat - We all seem to run out of hard drive space and finding good candidates to delete or move elsewhere can be tedious. WinDirStat solves that by scanning a hard drive and building a visual, color-coded "block map" of every file where the size of each block is proportional to its size. Click on the block and that file is highlighted and its directory structure displayed. By far the easiest way to clean up a hard drive I've found. This utility can be downloaded at http://sourceforge.net/projects/windirstat.

Monday, November 9, 2009

My Home PCs – Part 3 – Media and More Media

The first two parts of this four-part blog covered the not-optional portions of most home PCs, since handling email and feeding your web-browsing habit is probably the reason you bought a PC in the first place. But a PC is a computer and not just an average piece of furniture. And the best thing about owning your own computer is that it can run other software. In my case, most of that software handles various media, from music and pictures to video and audio. This list contains the ones I use most often.


  • Apple's iTunes - A must for those of us that have an iPhone or an iPod Touch, although Songbird (see below) is a strong contender for basic iPod music players. iTunes allows you to buy, rent or download music, videos, movies, TV shows, 100,000+ applications, and my favorite, podcasts. My personal podcast favorites (all free of course) include GeekBriefTV, The Jazz Suite, Dilbert, The Welch Way and TikiBar TV. You can find iTunes at http://www.apple.com/itunes. You'll also get QuickTime in the download (see blog part 2 for more info) and Bonjour, which allows various Apple devices to find each other without a lot of messy configuration. You might also pick up Safari, Apple's very capable web browser, although it's #3 in my book. See blog part 2 for better choices.
  • Songbird - An open source media player built on Mozilla technology that can be downloaded from http://www.getsongbird.com. The one major drawback to iTunes is needing to build playlists in order to download specific music to an iPod. And if you have a lot of music and ever need to rebuild all your playlists in iTunes, you'll find that's a huge hassle. Songbird can sync music by album or artist without needing to predefine anything. iTunes and Songbird can co-exist on your PC, but only one of them can control an iPod at a time.
  • Audacity - Another open source favorite, Audacity can record and edit audio. I've used it to create ring tones for my cell phone by clipping a few seconds from an audio file, and to build compilations of various components into a single audio file. Audacity is located at http://audacity.sourceforge.net. SourceForge is a treasure trove of open source software and worth searching when looking for free software.
  • Google's Picasa - Simply a great, free photo organizer that can import pictures from a scanner or digital camera. Simple to post photos to their free web service, order prints, group photos into albums, and their latest feature, facial recognition. Picasa includes some basic photo editing capabilities, but the three I use most frequently are cropping, lightening and straightening. Cropping allows you to just include a portion of a picture, lightening is great to fix those "too dark" pictures and straightening allows you to rotate a picture slightly, for example, to make that door in the background appear perfectly vertical. You can find Picasa at http://picasa.google.com.
  • AutoStitch - Sometimes you just can't get the scene you want in one picture and viewing several pictures just doesn't work for you. AutoStitch takes a series of pictures and "stitches" them together to form one panoramic picture. The result will need to be cropped (see Picasa above), but it's truly magical what this program produces. To find the download, enter "AutoStitch" into your favorite search engine.
  • Skype - The most popular free Internet voice and video service, it now handles a significant share of all international voice minutes. Calls between PCs are free and very popular in situations where friends and family are at a distant college or on an overseas assignment. My personal favorite is to access audio conference calls, since these are normally "800" number calls (free), avoid using my cell phone minutes, and I can use a $10 headset which frees up my hands. Skype even lets you know when you start talking into your muted phone. You can call any phone for a per-minute fee if needed. Call quality on Skype is stunning, just like those "you could hear a pin drop" commercials from years ago. It's also useful for calling places that cell phone companies typically block to avoid fraud (in my case the U.S. Virgin Islands). The software is available at http://www.skype.com.
  • Google Earth - You can get lost for days (in front of your PC) traveling around the world exploring the canals of Venice, the streets of Paris, the Pebble Beach golf course or the neighborhood where you grew up. Download from http://earth.google.com.

One caution - some of these applications, and other web-based video services, might just require a newer computer to run properly. We're not talking a $1,000+ high-end machine however. Most new machines will run just fine. And if you find yourself in the market, buy an extra couple gigabytes of RAM. You can send me a thank-you note in a couple years.

Sunday, November 1, 2009

My Home PCs – Part 2 – Office and Web Must-Haves

Having completed the five-step PC protection plan detailed in Part 1, it’s time to add some software that will make using your email system and web browser a more complete and pleasant experience. We’ll start with handling the most popular email attachments: documents, spreadsheets, presentations, PDF and ZIP files, all without costing you a dime. Every program listed below is 100% free for both commercial and non-commercial use.


  • OpenOffice is found at http://www.openoffice.org. This is an excellent option if you need an office suite, which many people do, particularly if you have school-age children. OpenOffice can open and save files in its native formats and in Microsoft’s formats. OpenOffice does a nice job opening Microsoft documents and spreadsheets, but struggles with some PowerPoint presentations. It’s a good idea to upgrade your Java Runtime (see below) before installing OpenOffice. An alternative to OpenOffice is Lotus Symphony from IBM, available at http://symphony.lotus.com/software/lotus/symphony/home.nsf/home. Based on OpenOffice, it's limited to documents, spreadsheets and presentations, whereas OpenOffice includes database, drawing and math programs.
  • Microsoft’s free viewers for Word documents, Excel spreadsheets and PowerPoint presentations. These viewers allow you, as you might suspect, to view attachments you receive, but not to change them. They do an excellent job of displaying files without losing any of the formatting contained in the original. To find these, go to http://www.microsoft.com/downloads and search on the word "viewer".
  • Microsoft’s Office Compatibility Packs. These allow Office 2007 files (docx, xlsx, pptx) to be converted to Office 2003 formats and used with your older Office suite. There may be some loss when features available only in Office 2007 were used, but in my experience that's not been a problem.
  • Adobe’s Acrobat Reader from http://www.adobe.com/downloads handles viewing ".pdf" files. Your PC probably has an old version on it that works, but keep this current to catch all the newest features and stop some of the most recent hacker attacks.
  • 7-Zip handles ".zip" files, a popular format for combining and shrinking multiple files into a single file. It also handles a number of other compressed file formats popular on non-Windows systems.

Your web browser will run across a variety of file types, and having the following products installed will enable you to see and listen to the most commonly encountered.


  • Sun’s Java Runtime Environment (JRE), which is needed to run programs that get downloaded from some web sites. It can be found at http://java.com/en/download.

The final four are needed to play the most popular video formats. Silverlight is the new kid on the block, so that one might not be familiar. QuickTime is also packaged with iTunes, which will be covered in Part 3 of this blog, so don't bother installing it here if you plan to use iTunes to manage your music.

  • Adobe’s Flash Player can be found at http://get.adobe.com/flashplayer.
  • Adobe's Shockwave Player can be found at http://get.adobe.com/shockwave.
  • Apple’s QuickTime can be found at http://www.apple.com/quicktime/download.
  • Microsoft’s Silverlight can be found at http://www.microsoft.com/silverlight.

Friday, October 23, 2009

My Home PCs – Part 1 – The Basics

A number of years ago I was messing around with partitioning my hard drive and managed to completely destroy my home PC. As I was trying to remember all the software I needed to install again, I decided to take the opportunity to document it all, figuring I would do something else stupid again in the future. That started the list; I shared it with a few folks, added some of their suggestions and it grew into a nice reference. I’ve also added some good tips and techniques to protect a PC from the bad guys. Several folks, after rebuilding their kids’ PCs every few months, adopted these suggestions, and all the feedback I get indicates it’s pretty solid. I decided it might be of interest to others, and hence I begin this multi-part blog, broken down into collections of related components. Part 1 starts with the basics, those things I recommend be done before you wander off to email-land or surf the web.

  • Apply all maintenance via the Windows Update facility, including all optional maintenance. The trick here is to run it, run it again, run it again, etc. until Windows Update is totally out of new patches. If you’re rebuilding a PC from a few years ago, this can take hours. Do it anyway. The bad guys really like taking advantage of PCs that are not patched. And make sure you have Windows Update automatically check every day and apply anything it finds. Yeah, it’s a pain to wake up in the morning and wait 5-10 minutes as you login. Do it anyway. And make sure to check Windows Update manually from time-to-time, and apply anything it finds.
  • Use a hardware firewall and never plug your PC directly into the box provided by your Internet Service Provider (ISP). Never count on the firewall running on your PC. Spend the $50. This will keep out those hackers that live anywhere in the world. You probably want the wireless access anyway, and these devices typically have a built-in firewall. I use a Linksys WRT54G, although several other models from D-Link and others work fine. If you have wireless access, use WPA2 if possible and follow the manufacturer’s instructions carefully. This will keep out those hackers that live in your neighborhood.
  • The most effective method of protecting your home PC is to create a new account as a “Limited Account”. You can use Control Panel ... User Accounts to create the new account with limited privileges, leaving the account that came with the machine as the administrator. This, more than anything else you can do, will prevent viruses, spyware and adware from infecting your computer. The downside is that most software installs will have to be performed from the administrative account. This inconvenience is well worth it.
  • Load the free AVG anti-virus/anti-spyware software. This is licensed for non-commercial use only, one of the few software products I’ll recommend that can’t be used for business. It takes a bit to get through the various up-sell options presented, but starting at http://free.grisoft.com helps a bit. AVG will automatically keep its virus signatures up-to-date, but watch for the occasional new version.
  • Since your web browser is always poking around the Internet, having a good one is very important. I use Mozilla’s Firefox, which is found at http://www.mozilla.com. Mozilla does a good job of quickly fixing problems and pushing fixes out automatically. It also does not support Microsoft’s ActiveX, a popular attack point for the bad guys. Firefox is fast and has lots of features, and a whole bunch more via add-ons. The only downside to Firefox is that it tends to consume more and more memory the longer it runs, which means you’ll have to occasionally shut it down and restart it. Google’s Chrome browser, found at http://www.google.com/chrome, is also very nice, has a great security model and is faster than blazes, particularly for JavaScript-heavy web sites like Google’s Gmail. I’ll move to Chrome exclusively when they have better tab support. I just can’t live without the Tab Mix Plus add-on to Firefox. Both these products, and most other browsers I’ve run across, are free for all purposes.
  • Use the OpenDNS service and configure it to block categories of web sites that you do not want your household accessing. I would suggest starting with the "Moderate" level, but you can select just the categories you want to block. If you have the typical home ISP service, your IP address can change from time to time. Use their Dynamic IP support (see www.opendns.com for details) to ensure your OpenDNS selections stay in effect when your IP address changes. This is somewhat technical and you might need your family’s Tech Support Person (there’s always a lucky one in the bunch).

Tuesday, August 25, 2009

Variety vs Uniformity

It is said that variety is the spice of life. It's also how consumers, in other words we as individuals, determine our own, unique approach to personal productivity, including our choices of the technology we buy and how we use it. It reflects our individuality, the differences that make each of us like no one else on the planet. I choose a Motorola RAZR v3, an Apple iPod Touch and a 19" HP laptop. I prefer powering my Griffin Evolve wireless speakers with Pandora. Your choices are undoubtedly different and reflect your values and preferences.

Businesses have a totally different approach to productivity. As much as possible they want standardization, from vacation policy to the email system. This drives costs down and gives everyone an equal playing field. So I have a choice of one laptop or desktop, one BlackBerry, one cell phone provider, etc.; my only real choice is whether I want one at all. This has worked well during the last twenty years, when technology has mainly been used between employees of the same company. It's also worked well because the rate of technology change needed solely within a business has decreased significantly, so not only are we standard, we're also increasingly out of date.

As businesses have continued to increase leverage to gain further cost reductions, they have had to go outside their four walls to achieve it. This may be an acquisition or an outsourcing contract. The consequence is a need to accommodate a greater variety of technologies, which runs counter to the cost-effective standardization they've achieved, and which they're not prepared to handle.

Is it any wonder that Internet-based services have begun to dominate the Communications and Collaboration space? You're a Mac, I'm a PC, and we both can use that web-based conferencing service, although neither of us could use the other's internal service. These services, born to handle the consumer's wide range of choices, are perfectly positioned to win the hearts and dollars of business users.

One way that consumers and business users are alike? They want it now. Game, set and match to the web. Not a fair fight anymore.

Thursday, August 20, 2009

70% / 30%

I heard a quote a few weeks back that took me in a far different direction than I'm sure the speaker had intended. "Only thirty percent of an IT budget is typically spent on new capabilities!". I'm sure this was intended to shock us, that so little was being done to help our business and it simply could not be tolerated. But as I dug deeper into this statistic, a very different picture emerged, and one that helps put many of the struggles IT deals with into perspective.

A 70%/30% ratio of maintenance dollars to new capability dollars, at its simplest, means that if your budget is constant, next year's additional maintenance cost caused by the new capabilities being created is exactly equal to the improved productivity savings in supporting the existing capabilities. If the ratio stays constant, and the budget stays constant, your business can continue to add new capabilities year-in and year-out, forever.

Neat trick. What other segment of your business can boast so loudly? Could you have a house built, add 30% to it each year and have your maintenance bill stay the same? Of course not.
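The equilibrium can be shown with a toy model. The 10% ongoing-cost figure for new capabilities is an illustrative assumption, chosen only to make the bookkeeping easy to follow.

```python
# A toy model of the 70/30 equilibrium described above. Assumes each
# dollar of new capability adds 10 cents of ongoing maintenance per year,
# offset exactly by productivity savings on the existing maintenance base.
BUDGET = 100.0
maintenance, new_work = 70.0, 30.0

for year in range(1, 6):
    added = new_work * 0.10                # assumed new ongoing cost
    savings = maintenance * (3.0 / 70.0)   # productivity gains offset it
    maintenance = maintenance - savings + added
    new_work = BUDGET - maintenance
    print(year, round(maintenance, 1), round(new_work, 1))  # stays 70/30
```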

IT has two things going for it, both pulling to make the equation work.

First, Moore's Law has continued to make the hardware cheaper. Mainframe MIPS (Millions of Instructions Per Second) cost millions of dollars in the 1960s, reduced to thousands of dollars today. Voice calls cost less than 10% of what they did just a decade or so ago. These are just a couple of the amazing changes that have occurred along the path of Moore's insight.

Second, IT continues to innovate, bringing a more connected world, fewer inefficiencies and enabling new business models. Without new innovations, the maintenance budget could stay constant, life would be boring and IT would be demoted to the least important of professions.

If IT can't keep the equation balanced, then something has to give. Either we can no longer afford the new innovations or we spend ever increasing amounts of money on IT. Neither scenario is good for the overall economy or the IT industry. Maintaining, or even improving on, the 70%/30% is an imperative.

So maintenance spend must continue to decrease. Increasing leverage (e.g. virtualization, cloud computing), using cheaper alternatives (e.g. web mail, open source, offshore programmers) and eliminating under-used or redundant services (e.g. portfolio management) are all levers that are being pulled in today's world.

When SAP announced their maintenance cost increase from 17% to 22%, a huge backlash resulted. Why? They dared to increase our maintenance costs, the very side of the IT equation we're working so hard to decrease. But most companies had to grin and bear that cost; the switching cost was far too high, at least in the short-term.

Microsoft, which for many years has delivered valuable new capabilities to our companies, now finds its cash cows predominantly on the maintenance side of the equation. New Windows operating systems and new Office suites are nice, but so what? The action is happening on the Internet, and how many new features can we really use? Unlike SAP, Microsoft's switching costs are relatively low and the alternatives are "good enough".

Given a choice, which one do you think has got to give? Not a trick question.

And the scenario will play out for every maintenance budget item. Do we need that? Is there a cheaper replacement? How can we avoid getting boxed in with vendors, particularly the ones we write the largest checks to? But understanding the underlying, fundamental forces at work here can turn this from a thankless chore to the most important work we can be involved in. We're keeping the innovation engine going strong!

Monday, June 22, 2009

Five Levels of Support

Those of us in the computer field are resigned to at least the occasional support role. If you're in the office next to the CIO, you might just want to consider it hazard duty.

But support comes in many flavors; I'll attempt to explain these five levels of support in more detail:

  • Support
  • support
  • Fsupport
  • Xsupport
  • FYBYOYO
Support refers to having complete ownership of the problem and responsibility to see that it is fixed in a timely manner. This is the level of support that most people recognize and are familiar and comfortable providing. The lines of responsibility are clear and they're in charge.

"support", or "little-s support", occurs when someone else, like it or not, has Support, but needs your help and assistance to resolve a problem. An application developer might need some help from a performance management expert or a network trace run to diagnose an issue. Care must be taken to avoid ending up with Support in these instances.

Fsupport, or "F-support", is familiar to those of us that provide computer support for Friends and Family, the "F" becoming clear. Since Friends certainly occur in the workplace, some confusion can arise when you offer "above and beyond" service for no apparent reason. Maybe you're the local expert that your colleagues would rather pull into a problem than call the Help Desk. This can become a balancing act, and care must be taken to pick your Friends wisely.

Xsupport, or "X-support", is easily recognized by anyone that studies organization charts. At a certain level an eXecutive, the "X" becoming clear, gets support and loving attention regardless of corporate policy, their love of non-standard computing equipment and any semblance of a cost/benefit analysis. Need to help their spouse with that new iMac at their summer home? No problem. Just be nice and all will pass by without a career-threatening cloud over your head.

That leaves us with FYBYOYO, phonetically "fib-YO-YO", which stands for "Forget You Buddy, You're On Your Own", at least in polite company. This is probably the least familiar of the five levels of support, probably because it appears in real life with the same frequency as blue M&Ms. Whether it's our love of being heroes, racking up a few markers to be called in later, not wanting to rock the boat or a truly altruistic nature, FYBYOYO is a hard stance to take. But if you're tempted, make sure you're clear about the first four levels. They might trump you more often than you think.

Monday, June 8, 2009

Losing Twenty Percent

Information technologists live in interesting times. The Internet continues to bring the platforms and services that mostly eliminate the problems inherent in the PC-based client/server era, such as large upfront capital investments, slow deployment of new or changed services and a daunting set of security issues rooted in its birth as a non-shared technology (i.e. DOS PCs with a dinky floppy drive and molasses-slow modems) attempting to thrive in the interconnected world of email, web-based services and ever-increasing bandwidths. While using the Internet has its own scale issues, it's clear to everyone from the board room to the kid's bedroom that PC knowledge is moving down in value, and knowledge of the Internet is moving up.

Dealing with change is hardly unique to IT people, but perhaps it's the constant, year-in, year-out bombardment of change that is not as prevalent in other fields. The way I've described it to people I've been fortunate enough to work with is to set their expectation, and my own, that twenty percent of the value of their current skill set is lost every year. Just to stay even requires learning a new twenty percent each year. This obviously is an estimate, varies from person to person and is greater in some years and less in others. But overall it's not far off and the point is not precision, it's direction. Stay still for long and your risk of being obsolete gets higher and the effort to retrain to an entire new skill set becomes tougher.
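The compounding effect of that twenty percent is worth seeing in numbers. A short sketch (the 20% annual loss rate is the article's rule of thumb, not a measured constant):

```python
def skill_value_remaining(years, annual_loss=0.20):
    """Fraction of today's skill value still relevant after `years`,
    assuming a constant annual rate of obsolescence."""
    return (1 - annual_loss) ** years

# Stand still for three years and roughly half your skill value is gone;
# five years and about two-thirds is gone.
for y in (1, 3, 5):
    print(f"after {y} year(s): {skill_value_remaining(y):.0%} remains")
```

The point, as the article says, is direction rather than precision: the decay compounds, so the cost of standing still grows every year you defer the retraining.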

A technology leader must work with each and every person to ensure that this does not happen. They must provide the constant push required to change their people, and the education, projects and rewards to pull them forward. It's easier to let people stay in a job and avoid the short-term productivity hit. It's easier to let someone convince you that they're happy where they are and avoid the whining. But it's harder to change attitudes baked in over years of stagnation and retrain, for example, a mainframe COBOL programmer in Java, C++ or Ruby. Those lucky enough to time their retirement to their obsolescence are the fortunate few. Most take jobs in less demanding parts of IT or leave the field entirely.

One of the most effective methods of gaining people's attention is to purposely, publicly and continually eliminate older technologies. That elimination may take the form of outsourcing the old mainframe systems, replacing dial-up modem banks with an Internet ISP or replacing manual PC software installation with an automated system. This is far from the only benefit you'll gain, but emphasizing the constant "out with the old, in with the new" mantra will send strong signals throughout the organization.

Change can overwhelm people, particularly changes that are large or arrive as surprises. The previous suggestions deal with the surprises, but what can be done to keep change in manageable chunks? Fortunately, that's where the twenty percent comes in handy. It's not that large if handled on a continual basis. But a totally new twenty percent is much more difficult than one that is somewhat familiar. Asking a programmer that knows several languages to add Ruby to that list is nowhere near as difficult as asking them to configure routers. Asking a network analyst to configure firewalls is probably better than asking them to learn a programming language.

While keeping within one's area and learning new skills is a good tactic, the most valuable people have skills in many areas and are ready to pitch in on just about any project that comes around. Some people seem naturally gifted to play in all areas, or maybe they just get bored more easily. Regardless, your organization needs to move people around departments, and the earlier people learn this, the better.

Sunday, April 5, 2009

Applying Data Center Recovery Principles to PCs

PCs have become the operational lifeblood of most organizations, and the volume of viruses, malware and zero-day attacks continues to increase. Very large security practices have been built up to address these problems, and yet the trend surges forward unabated. Are we simply stuck with continually paying more to protect our systems, or is there a point where PCs are treated in the same manner as the servers in the data center?

Large companies installed mainframes in the 1960s to automate back office processes, gain efficiencies and enable new and larger business models. It soon became apparent that the data center needed to be well protected. Guards, keypass entry cards, UPS power and fire suppression systems became the norm. But investments to keep the data center safe were not enough and the disaster recovery business was created to allow a company, at a fraction of the cost of running a duplicate data center, to recover their critical applications if the primary data center was unusable. You could never spend enough money to reduce the risk of losing the primary data center to zero.

Perhaps applying the same principles to recovering access to these same applications in the event of a massive virus outbreak, power blackout or communications failure would provide a cost-effective solution. These applications might also be the same ones that employees need while working remotely.

There are many ways to architect and design a solution, but the least common denominator in today's world is the web browser. If you're fortunate enough to have all your applications web-enabled, then you have a huge head start. Perhaps a dual-boot option on your corporate PCs with a Linux/Firefox option is enough to get your users productive again. Another strategy would be to have employees use their home computers, almost ubiquitous now, as their backup device. A final option would be to re-stage each PC, although this may take more time to accomplish than the business might be able to tolerate.

For those not fortunate enough to be fully web-enabled, which includes most of us, a solution to access those applications needs to be available, but it should not require a huge investment in hardware and software. The advent of pay-as-you-go Cloud Computing and more robust Open Source software come to the rescue. The idea is to build a ready-to-go desktop image in the Cloud (e.g. Amazon Web Services) using Linux, Firefox, native Linux applications and Windows applications under the Wine environment. This image would have the necessary VPN connectivity to your data center to access the back-end services. Each user would spin up a copy of the image, with proper authentication of course, and be back in business in minutes. Or perhaps leveraging open source virtualization software could allow multiple people to use one Cloud server concurrently.

This image might also be used for home or hotel access, and potentially avoid the extra costs of providing laptops by leveraging personal and hotel business-center PCs. A copy of this image that provides isolated access during your disaster recovery testing can significantly reduce that network effort. These are just a few of the possible uses for a solution architected in this manner.

Monday, March 23, 2009

Who Has A Bigger Problem

Your manager has just assigned you a difficult project or problem and you're at a loss to envision any possible solution. Is it time to hit the pervasive search engine and start looking? Perhaps a call to a favorite vendor, or perhaps an unknown one? Another approach is to stop looking for a solution and find someone that has a bigger problem than you do, preferably a much larger problem. Depending on the problem, you might even find that person inside your own company.

Odds are someone has a bigger problem than you do, and looking outside your normal field of vision is often needed. Looking to reduce the cost of your email system? Try an organization such as a university, another large not-for-profit corporation or a particularly financially distressed company for ideas. Want to improve the integrity of your data center? Can you find a company where revenue stops dead when they're down? How about the stock market? If they make the Wall Street Journal headlines when problems occur, that's a good place to start.

Money, or lack thereof, is a good place to start hunting. Following the money trail is also useful. Are you looking for management support for a new security idea and the CIO isn't very receptive? Who else in your organization is rewarded when incidents are reduced or eliminated? With the advent of SOX compliance putting more people on the chopping block for deficiencies, perhaps Audit, Compliance or the CFO has that bigger problem.

Sales people are rarely a good source, simply because they sell a solution to your problem, not to their problem. Their problem is making their quota, and rightfully so. Helping solve your problem may be aligned at times, but in most cases they sell stuff, not solutions, and their reward is not directly tied to your problem being solved, but to getting the contract signed. Sales people can, however, be a good conduit for making connections into the companies you want to investigate. A better way is to join and be active in one or two large user groups, or to leverage a company subscription to The Hackett Group, Gartner Group or other advisory and benchmarking firms. Using LinkedIn, Plaxo or other web-based social networks can also lead to making the right contact.

The idea is not to simply implement everything another company does, but to generate new insights into your specific problem. Perhaps one or two components of their solution are enough to satisfy your current needs. Stretching your mind in the direction of the problem, not an immediate solution, may be the key to your next "breakthrough".

Tuesday, March 10, 2009

Integration Architecture

When architecting a solution to integrate two systems, it's useful to realize that only three categories are possible. Applying the correct approach is crucial to avoiding extra coding, operational issues and higher costs. The three categories are:

  • Batch - A set of requests will be processed together at regular intervals. This is typically done to process requests more efficiently or during a window when resources are more available. This is a time-based mechanism.

  • Asynchronous - A response to a request is needed as soon as possible. The data will be delivered when the other side is ready and will never be lost. This is an event-based trigger mechanism.

  • Real-Time - A response to a request is required in real-time. In case the request is not fulfilled within a reasonable amount of time, the data will be discarded.

Batch interfaces have been around since the advent of mainframes and punch cards. A single method to handle batch data exchange should be used throughout the data center to simplify all operational aspects, from security to recovery. One folder structure could be developed for all Production data and a second for Test data. These folders should be mountable from all systems, so multiple protocols (e.g. NFS, SMB) may be required. File renaming is a useful practice to keep in-flight work from causing issues. As an example, suppose we have a folder named Payroll. Within the Payroll folder, we create Inbound, Ready, InProcess, Processed and Error folders. Data being created is put in the Inbound folder and, when complete, is renamed into the Ready folder. A batch job looks in the Ready folder, renames each item into the InProcess folder, then renames it again to the Processed folder upon successful completion or to the Error folder if unsuccessful. This simple example may not be enough for your requirements, so expand the concept with as many folders and subfolders as needed.
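The Payroll example above is essentially a small state machine driven by renames, which are atomic within a single file system. A minimal sketch in Python (folder names come from the article; the function names and the handler interface are my own illustration, not a standard API):

```python
from pathlib import Path

STATES = ("Inbound", "Ready", "InProcess", "Processed", "Error")

def make_interface_folders(root, name):
    """Create the per-interface structure, e.g. Payroll/Inbound ... Payroll/Error."""
    base = Path(root) / name
    for state in STATES:
        (base / state).mkdir(parents=True, exist_ok=True)
    return base

def advance(base, filename, src, dst):
    """Move a file between states with a rename (atomic on one file system)."""
    (base / src / filename).rename(base / dst / filename)

def process_ready(base, handler):
    """Claim each Ready file via rename, run the handler, then file the result."""
    for f in list((base / "Ready").iterdir()):
        advance(base, f.name, "Ready", "InProcess")   # claim it
        try:
            handler(base / "InProcess" / f.name)
            advance(base, f.name, "InProcess", "Processed")
        except Exception:
            advance(base, f.name, "InProcess", "Error")
```

Because the batch job only ever sees files that have been renamed into Ready, a producer still writing into Inbound can never hand over a half-written file.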

Asynchronous interfaces are typically built upon a message queuing infrastructure using products such as Microsoft's MSMQ or IBM's MQSeries. Like Batch, you should establish only one method for this type of exchange. Asynchronous interfaces are typically used to process data that needs very quick turnaround, but where the data can't be lost if the target system is unavailable.

There are a number of reasons that individual transactions can fail, so special attention needs to be paid to building a notification system that makes the necessary data available to take quick action on the error. One approach is to make a copy of any transaction that fails and put it into a message queue, where a program reads that data, sends the appropriate notifications and makes the data and error code available to the person performing the troubleshooting. A method of "replaying" the transaction by putting it back on its original queue is useful to avoid manually re-entering each failed transaction.
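The error-queue-with-replay pattern can be sketched with Python's in-process `queue.Queue` standing in for MSMQ or MQSeries (the function names and message format are illustrative assumptions, not any product's API):

```python
import queue

def consume(work_q, error_q, handler):
    """Drain the work queue; a failed message is copied to the error queue
    together with the error text so a person can diagnose it later."""
    while not work_q.empty():
        msg = work_q.get()
        try:
            handler(msg)
        except Exception as exc:
            error_q.put({"message": msg, "error": str(exc)})

def replay(work_q, error_q):
    """Put every failed message back on its original queue for another attempt."""
    while not error_q.empty():
        work_q.put(error_q.get()["message"])
```

In a real deployment the error queue would also feed the notification program, and replay would be triggered by the person troubleshooting once the underlying cause is fixed.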


Real-time interfaces come in lots of shapes and sizes and will vary from one vendor to the next. Standardization is coming slowly with the adoption of web standards, so you're likely stuck with supporting a variety of proprietary and open standards, fortunately in most cases with the aid of the vendor who knows their own choices well. Still, it will cause a great deal of operational support issues and these interfaces will tend to be your most critical. Put together a team to attack these issues before they become a business issue.

Your standards can be drawn using a generic Source System and Target System on either side of a diagram, with your Real-Time, Asynchronous and Batch solutions connecting the two Systems. Each of the three paths describes the specific hardware and software that is standard in your environment. This simple diagram can then be expanded with the specific details for a particular set of Systems.

A small number of highly reusable components will speed the delivery of your interfaces at greatly reduced cost and, most important, with the least amount of operational issues and business impact.

Saturday, February 28, 2009

A Four-Tiered Approach to Standards

Answering the simple question of what is your standard for a particular product, naming convention or password strength is often more involved than just a simple answer. The approach I use to set and communicate standards is a four-tier approach using Preferred, Standard, Non-Standard and Exception as categories.

Preferred is simply the product that I would like to use above all others. This could be for strategic reasons, advantageous licensing, skill base or any number of factors that make it rise above the rest. There is typically only one product assigned the Preferred tag.

Standard is the category for any remaining products for which we offer internal support. Anything in the Preferred or Standard categories can be expected to be fully supported, with multiple skilled resources available and with defined Service Level Agreements. Anything rated below these two categories is a warning that issues will need to be overcome before using those products.

Non-Standard is usually the catch-all category, naming common products that the internal staff does not have the skill base to support. The importance of this category is to inform decision-makers that additional costs need to be budgeted and that their support team will need to contract with others for support. Making this a dollars discussion usually drives the decision towards a Standard.

Exception means that I do not have the authority to allow this inside my company and that the decision-maker will need to have a discussion with someone higher up the organization chart. I’ve found Exception is better than No. No rarely works until that higher-up says No anyway. Exception says let me explain the situation, agree to disagree and let you know how you can press your case and the obstacles you may find. It turns an adversarial conversation into a useful, and professional, conversation.

Let’s make this more real with a totally fictional database example.

  • Preferred – Oracle – Company owns a site license, maintains a highly reliable database farm and all the staff is certified.
  • Standard – Microsoft SQLServer on Windows and IBM DB2 on AIX – Company has purchased a number of applications that did not offer Oracle as an option and at least three DBA’s are skilled in both.
  • Non-Standard – All others not specifically listed, including but not limited to Informix, MySQL and IBM DB2 on Windows. Use as an embedded database requires 100% vendor support.
  • Exception – Any mainframe database, since that platform is being decommissioned.
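The fictional database example could be encoded so that tools and decision-makers get a consistent answer. A minimal sketch (the product names are from the example above; the dictionary and function are hypothetical, not a real standards system):

```python
# Hypothetical encoding of the fictional database standards above.
DB_STANDARDS = {
    "Oracle": "Preferred",
    "Microsoft SQLServer on Windows": "Standard",
    "IBM DB2 on AIX": "Standard",
    "Any mainframe database": "Exception",
}

def tier_for(product):
    """Non-Standard is the catch-all for anything not specifically listed."""
    return DB_STANDARDS.get(product, "Non-Standard")
```

Treating "not listed" as Non-Standard rather than an error mirrors the catch-all intent of the category: the product isn't forbidden, it just carries extra support cost.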
These standards can become quite involved and should explain as much of the “it depends” as possible. But there can be inter-dependencies between standards or special circumstances that make absolute statements impossible to craft. For example, maybe the vendor just started offering Oracle support but has a long track record on SQLServer. Judgments on these types of cases will be necessary.

Also recognize that your standards will change over time and will need to be communicated as they are modified. A good time to do that is a month or so before you get involved in the budget cycle. Maybe you’ll need some additional training dollars for a product you want to make Standard. Perhaps some outside contractor resources will be needed to cover a product demoted to Non-Standard. Or maybe a capital outlay for a new site license for that Preferred standard that is really taking off.

Saturday, February 21, 2009

Eliminate, Automate and Delegate

Sometimes just having a simple methodology for approaching your work helps provide focus and achieve better results. One of the approaches I commonly use is Eliminate, Automate, Delegate, taken in that order. This is by no means rocket science, but I commonly run across efforts that failed to deliver the best results and could have used this approach.

Eliminate is by far the best result that can be obtained. Start by brainstorming ideas to completely eliminate the need for whatever you're looking to improve. If it can't be completely done away with, can at least some portion of it go away? Twenty years ago I was involved in a project to access email via the telephone. The original approach was proving daunting and threatening to kill the project. A group got together and looked for a way to resolve the problem. The solution involved moving from a full-screen to a line-mode interface, which eliminated seventy-five percent of the coding effort and made the service much more reliable. Considering an option that looked like going backwards (line-mode was so 1970's) proved to be the key. The prototype was available a few days later.

Automate is replacing human effort with a non-human effort. Job scheduling is a common data center example and robots welding cars applies in the manufacturing world. How many web sites do you check out each day for information? Perhaps moving to an RSS reader, a form of automation that pulls in articles of interest, is a more efficient way to gather that information. Alerting is a common output of automation, only interrupting you when necessary. In this case, you’ve both automated the task and eliminated the need to check it out as often.

Delegate is taking the work done by a higher-paid person and shifting it to a lower-paid, but still qualified, person. Too often professionals spend a large amount of their time doing work at a grade level far below what they are paid. In some cases a person's desire to perform a lower-valued task comes from their pride in building a solution from the start; it's "their baby". Whatever the reason, good or bad, spending too much time performing lower-valued work will limit your time for new projects, ending with no more new "babies" to take pride in. Delegating takes a commitment to training, letting people make those mistakes (after all, you made yours along the way) and encouraging them. Think hard before you accept an "I'll just do it myself" attitude.

Things you eliminate can no longer go wrong or waste money. Tasks that you automate usually cost a fraction of a human's effort and never get tired or bored. Delegation creates valuable time for the highly skilled and develops new skills in others. But the key is to approach them in the proper order and to generate enough ideas to fully explore each opportunity.