
Posts from the ‘Security’ Category


CIOs: Focusing on Obstacles will Limit your Success with Cloud Computing

No matter where you stand on the new Cloud technologies, there is no escaping the fact that Cloud Computing has everyone’s attention. Some business executives see it as an opportunity to financially restructure their IT expenditure. Some focus only on the perceived risks of placing their data and systems in the hands of an external third party. Still others see it as a means to focus on their core business and new business ideas without having to worry about whether the computing infrastructure will be able to cope.

While IT teams must ensure that systems are safe and data is secure, it is ironic that by focusing too much on security and availability, many CIOs are exposing themselves, and their employers, to a far greater risk – the risk of missing the opportunities presented by new technologies emerging from Cloud Computing.

The CIOs who will provide the greatest value to their employer will be those who approach Cloud by asking themselves “what can we now achieve that was previously inconceivable?”

While there are many distracting arguments about what constitutes Cloud, the key characteristic that differentiates it from more traditional approaches is that Cloud provides the freedom to be remarkable – the freedom for a business to focus on what it does best without constraints imposed by infrastructure.

Traditional approaches to IT involve acquiring dedicated equipment on a project basis, with each new system requiring new equipment and administration. This leads to ever-increasing IT complexity, with the IT department working to prevent things from getting out of hand. In many cases this has led to a perception that the IT department is the problem, and many IT budgets are shifting to marketing as a result. Under a Cloud model, IT should evolve into a reservoir from which new equipment is instantly sourced, and a platform underpinning whatever the business or the marketplace throws at it, scaling up and down to meet changing demands. While traditional approaches add complexity, Cloud provides the freedom to focus on the business imperatives.

IT leaders who embrace Cloud computing as an enabler will not be seen as roadblocks by the marketing or sales departments, especially when they adopt Open Cloud approaches such as OpenStack, which overcome vendor lock-in and allow data to be hosted on-premise, off-premise, or a mix of the two.

Cloud computing enables businesses to take advantage of the relationships their customers have with each other. For the cost of a coffee, businesses are able to experiment with new technologies by renting computers for the few hours it might take to trial an idea. They can continuously update their web presence in response to constantly changing patterns of behaviour. They can forge ahead with an initiative knowing that if it exceeds their expectations, the platform can grow to accommodate it, and then shrink when the job is done. They can scale while maintaining a specialized relationship with each individual client. They can identify trends and make predictions based on analysing unprecedented amounts of data. Their employees can collaborate, find information and respond to events and customer demands with far greater agility than ever before. Those who truly adopt this approach understand that the Cloud is independent of issues such as on-premise versus off-premise – provisioning can include a mixture of both, and even bare metal machines can be incorporated into a Cloud-oriented approach.

CEOs need to understand that the opportunities to stand out have never been greater. They can help their businesses capitalize by making it clear to their CIOs that it is no longer enough just to ensure that systems are operating and data is safe. Cloud computing levels the playing field like never before, and CEOs should put their CIOs on notice: be first to come up with the next wave of innovation, or there will be more at risk than their jobs.


Theoretical Disaster Recovery doesn’t cut it.

I have mixed feelings about Amazon’s latest outage, which was caused by a cut in power. The outage was reported quickly and transparently, and the information provided afterwards described a beautifully designed system that would deal with any conceivable power loss.

In theory.

After reviewing the information provided I am left a little bewildered, wondering how such a beautifully designed system wasn’t put to the ultimate test? I mean, how hard can it be to rig a real production test that cuts the main power supply?

If you believe in your systems, and you must believe in your systems when you are providing Infrastructure as a Service, you should be prepared to run a real live test that exercises every aspect of the stack. In the case of a power failure test, anything short of actually cutting the power in multiple stages, testing each line of defense, is not a real test.

The lesson applies to all IT, indeed to all aspects of business really – that’s what market research is for. But back to IT. If a business isn’t doing real failover and disaster recovery testing that goes beyond ticking boxes to actually playing out conceivable failure scenarios, who are they trying to kid?

Many years ago I set up a Novell network for a small business client and implemented a backup regime. One drive, let’s say E:, held programs and the other, F:, carried data. The system backed up the F: drive every day and ignored the E: drive. After all, there was no need to back up the programs, and disk space was expensive at the time.

After a year I arranged to go to the site to do a backup audit, and discovered that the person in charge of IT had swapped the drive letters around because he thought it made more sense. We had a year of backups of the program directories, and no data backups at all.
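The fix for this class of failure is to audit backups by their contents, not their labels. As a minimal sketch (the file extensions are illustrative assumptions, not from the original setup), a script can refuse to trust any backup set that contains no recognisable data files:

```python
import os

# Assumed data file types; adjust for the business in question.
DATA_EXTENSIONS = {".db", ".csv", ".xlsx", ".mdb"}

def audit_backup(backup_root: str) -> None:
    """Fail loudly if a backup set contains no recognisable data files.

    A drive letter or folder label says nothing about what was actually
    copied; only the contents do.
    """
    data_files = [
        os.path.join(dirpath, name)
        for dirpath, _, names in os.walk(backup_root)
        for name in names
        if os.path.splitext(name)[1].lower() in DATA_EXTENSIONS
    ]
    if not data_files:
        raise RuntimeError(f"backup at {backup_root!r} holds no data files")
```

Had a check like this run after every backup, a year of program-only backups would have been caught on day one instead of at the annual audit.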

Here is the text from Amazon’s outage report:

At approximately 8:44PM PDT, there was a cable fault in the high voltage Utility power distribution system. Two Utility substations that feed the impacted Availability Zone went offline, causing the entire Availability Zone to fail over to generator power. All EC2 instances and EBS volumes successfully transferred to back-up generator power. At 8:53PM PDT, one of the generators overheated and powered off because of a defective cooling fan. At this point, the EC2 instances and EBS volumes supported by this generator failed over to their secondary back-up power (which is provided by a completely separate power distribution circuit complete with additional generator capacity). Unfortunately, one of the breakers on this particular back-up power distribution circuit was incorrectly configured to open at too low a power threshold and opened when the load transferred to this circuit. After this circuit breaker opened at 8:57PM PDT, the affected instances and volumes were left without primary, back-up, or secondary back-up power. Those customers with affected instances or volumes that were running in multi-Availability Zone configurations avoided meaningful disruption to their applications; however, those affected who were only running in this Availability Zone, had to wait until the power was restored to be fully functional.

Nice system in theory. I love what Amazon is doing, and I am impressed with how they handle these situations.

They say that what doesn’t kill you makes you stronger – here’s hoping we all learn something from this.


Information Security – A New Frontier?

Traditionally, companies have focused their security efforts on protecting internally managed, internally generated information from reaching unintended audiences. This includes unpublished financial performance, sensitive employee and customer data, intellectual property, current tenders in progress, business strategic plans and so forth.

At the same time, public information specialists have ensured that the company’s public face, its brand and reputation, are protected and enhanced. Never the twain shall meet.

IT, working in a vacuum, has increasingly espoused the philosophy of control and containment. The common wisdom is to manage what you can control, work within your sphere of influence – because things that happen outside your control are just that: outside your control.

The rise of social networking is changing this, but company IT departments are slow in recognizing this shift. Here is an illustration.

I was at a luncheon on Information Security hosted by PricewaterhouseCoopers recently and was struck by a comment from one of the presenters. He said he was encouraged that in previous years we were talking about IT Security, now we were talking about Information Security, and he hoped that in future years we would have moved on to talking about Information Risk.

This started me thinking, so I posed a question to the panel: we talk a lot about protecting endogenous, or internally generated, information, but what about exogenous, externally generated information? This is the stuff that happens in the public domain – customers, the media, even employees to some extent talk about the company and its products in the public arena. This information pertains to our company, its products and services, but it is generated externally. I made the comment that in the past we could, to some extent, control this exogenous information, but today, with Twitter, Pinterest, Facebook, YouTube, blogs and the like, the public has a lot of leverage. I asked them for their thoughts on security over exogenous information in this new world.

Their response? They told me that companies need to think long and hard about allowing staff to access Facebook and Twitter at work.

It seems to me that PR and marketing people are a LONG way ahead of IT people when it comes to this type of information security. Blocking staff access to social media at work is like holding up an insect screen to stop a tsunami.

It is past time for IT managers to broaden the scope of their security thinking and engage with other areas of the business to form a coherent plan designed for the modern era.


Cloud Computing Is Like the Food Industry and Needs to Behave the Same Way

It was a big day today for cloud computing. At least it certainly seems that way after two of the most egregious issues ever witnessed in cloud computing service provisioning, both emerging in the past 24 hours. I feel compelled to comment on Distribute.IT’s catastrophic and irrecoverable data loss, and on Dropbox’s temporary lapse of control over their security system, which allowed any user’s password to access any user’s account.

Let’s first of all recap what happened.

  • Dropbox had an incident they described as an “Authentication Bug”: a code update allowed any user to log into any account using any password. The security hole was open for almost four hours.
  • Distribute.IT, who provide web hosting services, had what they described as a “deliberate, premeditated and targeted attack” on their network. This impacted pretty much their entire business infrastructure, and four major servers were irrecoverably lost, including all “production data, key backups, snapshots and other information that would allow us to reconstruct these Servers from the remaining data.” The upshot: more than 4800 customer websites, data and email reservoirs eradicated.
I have to say in both cases I am rather stunned.

CIOs Moving to the Cloud: The buck still stops with you

Amazon Web Services has been going through a much publicised outage, which by all appearances has lasted more than 12 hours. A range of services, including Hootsuite, Reddit, Heroku, Foursquare and Quora, have faced major disruptions.

What is interesting is how the affected companies have positioned these outages: many have said EC2 is great, but that they are having a bit of a problem at the moment. It appears these providers are taking the view that “whew, glad we outsourced our stuff so it is clear it is not OUR fault that something like this has happened, and we can point to other vendors to prove it wasn’t us – just imagine if we had done this on our own servers and this happened, we would have been much more at fault!”


Moving systems to the cloud does not remove the responsibility to ensure that mission-critical outages are prevented. If a business has a use case that cannot tolerate downtime, then that business needs to architect its solution in a way that prevents downtime. Cost tradeoffs are always an issue, but if something goes wrong and the cost of that problem is too high, then perhaps the service isn’t really feasible.

Imagine an airline providing a service where they cut costs on safety in order to offer a cheap service… Doesn’t bear thinking about. Imagine that the airline outsourced their safety inspections to a third party and then wiped their hands of responsibility in the event of a “downtime”. No-one would buy that.

The whole point of the cloud is that it frees your thinking from dependence on one provider. Even if you stick with an Amazon-only solution, or a Microsoft or Google or Salesforce or Rackspace solution, you still need to architect things in a way that allows you to accept the consequences of any flaws, no matter how they are caused.
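One way to accept the consequences of any flaws is to treat every provider, including your primary, as expendable. A minimal sketch of that idea (the endpoint names and health check are hypothetical stand-ins, not any vendor’s API):

```python
def choose_endpoint(endpoints, is_healthy):
    """Return the first healthy endpoint, trying providers in order of
    preference; fail only when every provider is down at once."""
    for endpoint in endpoints:
        if is_healthy(endpoint):
            return endpoint
    raise RuntimeError("all providers are down; no failover target left")

# Example: the primary region is out, so traffic shifts to the secondary.
endpoints = ["primary.example.com", "secondary.example.com"]
down = {"primary.example.com"}
assert choose_endpoint(endpoints, lambda e: e not in down) == "secondary.example.com"
```

A real deployment would replace the lambda with genuine health probes and replicate state between providers, but the architectural point stands: the failover path exists because you designed it, not because a vendor promised it.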

After all, you are the service provider to your customer base – how you decide to deliver that is up to you.

A lot of people are learning a very hard lesson at the moment – there are good ways and bad ways of doing things. For some, a 12 hour outage is hardly a problem, but for others it can ruin lives.


Security begins at home

I recently remembered a situation a year or so ago when a friend called me to help with a hard disk crash that had cost them all of their client data and rendered the machine inoperative. This was a business based out of a home office. I won’t go into the nature of their business, to avoid embarrassing them, except to say that if their clients had caught wind of how close their data came to being lost, it might have been bad for this small business operator. Really bad.

“Was there a backup?” I hear you ask. No, there was no backup. Fortunately for the person involved and their clientele, I was able to recover the hard drive. While I was doing this, I asked whether they had considered putting the data in the cloud, and their response was ironic to say the least: “Oh, I wouldn’t do that, the data is too important, I would be worried about the security.” This was said while I was working on a computer placed in front of an exposed window.

It got me thinking: how secure are our businesses when we expose our systems through our homes so badly?

Here are some things to think about when home meets office:

  1. What would happen if your home computer were stolen?
  2. Is your home computer connected to your office using automated VPN scripts? If so, is the machine adequately password-protected?
  3. Do you have automated email clients that allow access to your email without you having to log in to view or respond to it?
  4. Do you have intellectual property on your home computer that would be damaging if it fell into the wrong hands? Think source code, client lists, product plans, minutes of meetings and so on.
  5. Do you have children who use the computer and could share access with friends or inadvertently install malware?
  6. Are external people able to use your computer – babysitters, friends of the family and so on?
  7. Do you discuss work with your family members? If so, how clear are you that they are not sharing your news or company secrets with friends, or posting comments on Facebook or Twitter?
  8. Do you have company backups at home that could fall into the wrong hands?
  9. Does your computer connect automatically to key systems like ERP, CRM, project management, source code version control or databases?

We tend to take a lot of care about our work environments, but it pays to be vigilant about the worst case when business meets the home environment.


The Privacy Membrane

People keep going on about Privacy when it comes to the cloud. Privacy: it is like a religion. “We must preserve privacy in everything we do.” If you think about this general view for a moment, it becomes clear very quickly that it is superficial and without much substance. Managing privacy is about ensuring:

  • That we can get access to “stuff” we need, want and have a right to get access to, when we want or need it;
  • That we can prevent others from getting access to “stuff” we don’t want them to get when they have no right to it (or we have the right to prevent them from getting it);
  • That we can disseminate “stuff” (to which we have a right) to people (or systems) when we want to;
  • That we can prevent others from exposing us to “stuff” we don’t want to receive and have a right to avoid.

An Internet search of ‘privacy taxonomy’ yields a lot of academic material on this topic, but I thought it would be worth conveying a few key points to get people thinking about the fact that information privacy is not just some black and white concept that applies without thought across the board.

The Privacy Membrane

The diagram to the right highlights that there are different types of information and how this fact applies to the privacy debate. Clearly, from an information producer’s, or custodian’s perspective some information we want to keep to ourselves, some information we want to share with the world. Likewise, from a consumer’s perspective, some information we really don’t want to receive, while other information we prize highly.

In the diagram, information flows in two directions – from us and to us. Some flows are desirable (green), while some flows are undesirable (red).

The diagram implies domestic and commercial use, but this is indicative only and can be applied in all permutations – domestic to domestic, domestic to commercial, commercial to commercial and commercial to domestic.

One interesting implication of the diagram is that the nature of information (in terms of its privacy) differs between the view of the entity with the information and the view of the potential recipient.

So let us examine each of these in turn. In the description given below, the term possession is used generically to imply either ownership or custodianship, and should be considered in the widest possible terms. Each type is examined in the order of the diagram starting at the top left, going down then across.

Type 1: Information we possess and don’t want others to possess.
This is information whose possession by others would, in our view, have some sort of negative ramification for us, ranging from personally embarrassing or damning through to commercially damaging. Examples include:

  • An employee interviewing for another job;
  • Trade negotiations or terms;
  • Customer information such as credit card or phone numbers, health records, trading history, information garnered under legal professional privilege;
  • Nefarious or embarrassing activities such as infidelity, crime or doing something against the will of a parent, spouse or employer;
  • Details about a planned surprise.

In all of these cases there is some reason why the possessor would not want others to gain access to the information. Note that in some cases the information’s privacy value is temporary; in others, not so much. The value of the information to others is not a factor, except to the extent that, from the perspective of the possessor, it would be damaging for the information to leak across the privacy membrane.

Sometimes the damage in this case is associated with the information itself – a villain gets hold of a credit card – and other times it is not the information per se, but rather the fact that some information, any information, has leaked is cause for a loss of trust in a custodian. For example if a bank, accountant or broker were to release details of customer balances, the results would be devastating – not necessarily for the customer, but certainly for the bank etc. An example of this happened this week with Vodafone customer information including names, numbers and credit card details being exposed on public websites, over which some employees have lost their jobs.

The degree of risk of information leaking in this category depends on many factors, including:

  • The perceived damage to the possessor of losing the particular information
  • The perceived damage to the possessor of losing information in general – this is very much dependent on the nature of the possessor. For example, a child leaking information is likely to suffer little damage compared to a major cloud provider.
  • The perceived value to a potential recipient of the information, the number of potential recipients, and whether the information is single use or has value to many people.

Type 2: Information we possess we want others to access

Once again there is a wide variety of information in this category and the perceived value to the (initial) possessor varies as well. Examples of information in this category include:

  • News possessed by a journalist or publication;
  • Details about an upcoming social occasion;
  • A new product announcement;
  • A limited-time offer, with or without steak knives;
  • Results of personal or corporate achievements to be shared for glory.

In these cases, the value again depends on the context and the timing. A news story is of value to a journalist if it is timely, and better still, uniquely obtained. Once it has been published by others, its value is often diminished. Note that the value to the possessor is in some ways independent of the value to the targeted recipient(s), but in other ways depends greatly on how it is perceived by them. A wedding invitation carries great value in many cases, while a “spam” email usually carries negative value.

Type 3: Information others possess we would like to possess (whether or not we have a right to it)

This is where privacy takes on a different nuance – how to gain possession of information. Information may be public, such as the weather, a currency exchange rate or a share price; or it may be private – either pertaining to us (information we have a right to, such as a bank balance or a medical or academic test result, or information we don’t have a right to, such as employer discussions about our future, or details about who voted for us or gave us a positive review after a presentation) or pertaining to someone else.

Type 4: Information others possess we do not want to possess (not now, perhaps not ever)

Once again there is a wide variety of information in this category, ranging from spam, where someone else wants us to possess the information, to clutter we deem irrelevant – the noise around us that distracts. As with the other types of information, sometimes it is a question of context – we may want to possess the information at another time, and increasingly, software systems, especially those driven by cloud technologies, are giving people intelligent, context-based filtering informed by digital body language, trends, historic decisions and actions.
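The four types above reduce to two questions: does the information flow from us or to us, and do we want that flow to cross the membrane? As a sketch of the taxonomy (the names here are mine, not a standard model):

```python
from enum import Enum

class Flow(Enum):
    OUTBOUND = "we possess it"      # information flowing from us
    INBOUND = "others possess it"   # information flowing to us

def membrane_type(flow: Flow, desired: bool) -> int:
    """Map a flow and its desirability to one of the four types above."""
    if flow is Flow.OUTBOUND:
        return 2 if desired else 1  # publish it, or keep it behind the membrane
    return 3 if desired else 4      # seek it out, or keep it out

# A trade secret: we possess it and do not want it to leak (Type 1).
assert membrane_type(Flow.OUTBOUND, desired=False) == 1
# Spam: others possess it and we do not want to receive it (Type 4).
assert membrane_type(Flow.INBOUND, desired=False) == 4
```

The point of reducing it this way is that "privacy" names four quite different management problems, each with its own tooling.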


The management of the privacy membrane is where software vendors and IT service providers earn their money. Allowing access to information, sometimes mission-critical details, while preventing others from accessing that same information is at the core of what the IT industry is all about.

The cloud will increasingly facilitate the crystallisation of these differences and provide us with an increasingly sharp focus on what matters to us. A greater understanding of the management of the privacy membrane, letting things through when necessary, preventing things from either direction when required, and transforming, anonymising, or merging them together from disparate sources as appropriate, will allow for a user experience that will seriously change the nature of the privacy debate in the immediate years to come.

Hopefully we can get past the religious view that blindly follows the mantra “all data is private and privacy must be preserved in all cases” to a view that protects information as and when appropriate (without compromising security), transforms information as and when appropriate (without compromising security or accuracy), facilitates the supply of information as and when appropriate, and extracts some serious value from all this “stuff” in our possession.


Property Rights to Information in the Cloud – A Cloud based view on the Coase Theorem

When I studied economics in the early 1980s, we learned of the Coase Theorem, which always fascinated me. The theorem is attributed to Ronald Coase, who has since earned the Nobel Prize for Economics (1991).

It occurred to me recently that the Coase Theorem may have some fascinating implications for the property rights of information stored in the Cloud.

The Coase Theorem, as I recall it, goes like this: regardless of who owns resources initially, given clearly defined property rights and zero transaction costs, resources will always end up allocated most efficiently.

This makes for some really interesting discussions about the Internet and property rights to information. The theorem is particularly relevant here for two reasons. Firstly, in the Internet world, transaction costs asymptotically approach zero, meaning that the costs of transferring or asserting ownership of information are infinitesimally small, and getting lower all the time. Secondly, property rights are subject to a whole range of debates around privacy, rights to share, rights to mail, sovereignty and rights to access. So if property rights can be defined, the best allocation of resources can, according to the theorem, be ascertained.

For the first time, we have a situation where the theorem can be tested on a massive scale, because transaction costs are now so low as to have been unimaginable when the theorem was first postulated. Economists are famous for proposing academic models, but here we have one that can actually play out in real life, where the focus is on the property rights, not the transaction costs.

So what does this imply? More research will be required, I am sure, but initially there are some interesting trends emerging. We are seeing some stupendous valuations placed on the holders of the information we have. Facebook stands out as a particularly interesting case study because of the ownership debates and the sheer scale of data being pushed through that platform. Google is interesting because it can figure out what we are interested in and match that to marketers.

What does this say about the valuation of our personal data? Will a greater understanding of the Coase Theorem, as it applies to Web 2.0, put a value on our personal data? Our spending patterns? There are already small examples of people receiving money for their data, their opinions, their search history and their web trails. There are also plenty of examples where people are paid in the form of free software in exchange for the right to deliver advertising.

One thing is certain – we should not be giving up our rights to our data without fully understanding how valuable it is. The Coase Theorem suggests there is more value there than appears on the surface, and a little care should be exercised in the way we manage this intangible property.

I will have to think further on this.


Misplaced concerns about privacy in the Cloud?

Here’s a thought: imagine needing a solution for digitally processing diverse vendor bills or handwritten documents with 100% accuracy. Imagine these come in continuously, but with no idea of frequency. Obviously if you can provide some sort of API then others can hook into your system directly, but what if you are dealing with consumers who won’t use a computer? With Amazon’s Mechanical Turk you can programmatically assign these tasks to the public in a bidding system where you set the price of the request. You can make three independent requests for someone to enter the data into your database, compare the three results, and only if all three match do you consider the record processed. If one of them doesn’t match the other two, you go out with a new request and keep doing so until you get three that match. Anyone who did not match would be marked with a demerit, and with enough demerits you would block them from accepting future tasks. They would also be incentivized to do well, because it would affect their public rating.
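The three-way matching scheme described above can be sketched as follows. This is a toy simulation of the logic, not Mechanical Turk’s actual API; the worker and task representations are assumptions for illustration:

```python
from collections import Counter

MAX_DEMERITS = 3  # assumed threshold for blocking a worker

def consensus_entry(request_entry, workers, demerits):
    """Collect independent transcriptions from eligible workers until three
    match exactly; workers who disagreed with the accepted answer earn a
    demerit, and anyone over the limit is skipped for future tasks."""
    answers = {}  # worker -> that worker's transcription
    for worker in workers:
        if demerits.get(worker, 0) >= MAX_DEMERITS:
            continue  # blocked: too many past mismatches
        answers[worker] = request_entry(worker)
        value, count = Counter(answers.values()).most_common(1)[0]
        if count >= 3:  # three independent matches: accept the record
            for w, a in answers.items():
                if a != value:
                    demerits[w] = demerits.get(w, 0) + 1
            return value
    raise RuntimeError("not enough agreeing workers for this task")
```

With four workers where one mistypes an invoice number, the three correct entries win and the dissenter picks up a demerit, exactly the feedback loop that keeps the public rating meaningful.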

The cloud enables all sorts of variations of this model. It provides a means to connect low-paid service providers with companies that require tasks to be completed quickly and efficiently at very low cost. In essence it is similar to the microcredit schemes initiated by the Grameen Bank in Bangladesh and others, in the sense that it opens up avenues of empowerment, but it potentially opens up opportunities for corporates to benefit as well. Incidentally, the founder of the Grameen Bank, Muhammad Yunus, won the Nobel Peace Prize for his work.

For many businesses this scenario is a nightmare – the encapsulation of the very things that prevent them from considering the cloud. And in many cases, it is simply not an option. But it makes for an interesting thought experiment: how far can we go, in the interest of efficiency, in opening our systems up to micro-outsourcing arrangements like this?

I suspect that over time scenarios like this will become more acceptable. Today though, I can’t see many people signing off on an implementation like this. If it were me, I would be looking to SOA models and trying to get suppliers into a B2B relationship. Years ago EDI would have been the way – if you wanted to be a supplier to one of the big department stores, you needed to hook into their systems. But this is a digression – the example postulated was about non-technical integrations.

But it raises the question of why we are so focused on privacy concerns in the cloud to the exclusion of the benefits. Sure, the above example opens a Pandora’s box of privacy concerns and would be almost universally rejected, but what about the normal, regular uses of the cloud? For most scenarios, the lengths to which the major cloud service providers go to ensure data is accessible only by those who should see it should allay any fears – after all, the big cloud providers typically have a lot more to lose if they leak corporate data.

It is not the cloud vendors we should be fearful of; it is the way we choose to use their services, the way we choose to run our companies, and the way we choose to view the world in which we live.


Pride in Equipment

CIOs and IT Managers of the past took pride in how many servers they managed. CIOs and IT Managers of the future will pride themselves in how few.

Recently one of my team members went to one of our regional offices to help roll out some new equipment and redo the IT infrastructure. What they discovered was chaos:

Notice the mess?

Here is the result after fixing it, with all cables color-coded, fiber optics built to connect everything etc:

Mess? What mess?

The point I am illustrating is that the server room, like everything else, needs maintenance and attention. It needs to be managed by professionals. Why would anyone want to do this when their core competency lies in making people laugh, making widgets, or fixing them?

I have been at conferences where people talk about the tribulations of moving premises with a server room, and how proud they are of their new server room and the team who set it up. But seriously, why should anyone be proud of their server equipment for its own sake? Yes, we should take pride in our work; yes, we should take pride in being professional; and we should be embarrassed if we see entropy bring a rack down to the example shown here. But our true source of pride should come from enabling our business to be the best it can be at delivering its core competency.

Unless servers are part of your core competency: more servers, less fun, I say.
