

In the quest to implement best-of-breed technologies, organisations often lose sight of the real reason for implementing them. Some use technology to maintain parity with their peer group, while more advanced ones want a competitive edge over the others. Business Analytics is one such strategic initiative that aims to differentiate organisations by leveraging information assets. However, it is sad to see that most BI (or Business Analytics) initiatives have failed to deliver their intended results, and consequently the technology investment is unable to deliver the business outcomes.

Interestingly, as an IT fraternity we often attribute this failure to technology per se and look for solutions in the technology domain. However, a large part of the failure can be attributed to behavioural aspects of both business and IT. In other words, the perception created by any system plays a critical role in defining its success. This is especially true of a business analytics system, which is expected to have strong sponsorship from business functions such as Marketing, Sales, Customer Support, Finance, Supply Chain and HR. Hence it is worth examining the factors that influence the business perception of a BI system in the post-implementation, or adaptation, phase. Quality is a universal parameter that influences the perception of every product or service. In the case of a BI system, which is meant to churn out meaningful, contextual information for effective decision making, two important aspects of quality, namely system quality and information quality, play a vital role in shaping the business perception.

Information Quality: The notion of information quality should assume a context-based view, implying that it should be assessed as the degree to which it helps the user complete a particular task. This contextual view expands the dimensions of information quality beyond accuracy to include relevance, completeness and currency, all of which shape the user's perception of quality. Users may have different demands for currency, and as a consequence, information that is viewed as current for one task may be viewed as outdated for another. For instance, if the aggregated sales figure for a particular product in a certain region is not updated for a particular period, the business user will find an excuse not to use the BI tool, although a latency of 2-3 days can hardly impact the quality of a decision when the objective is diagnostic analytics, i.e., delving into the factors contributing to lower sales. Therefore, the CIO needs business buy-in on the acceptable level of information latency that does not impact the quality of business decisions.
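The context-dependence of currency can be made concrete with a small sketch. The thresholds below are purely illustrative assumptions (the post names no specific figures beyond the 2-3 day example): each task type carries its own acceptable staleness, so the same data can be "current" for one task and "outdated" for another.

```python
from datetime import datetime, timedelta

# Hypothetical freshness thresholds per task type (illustrative only):
# diagnostic analysis tolerates days-old aggregates; operational
# monitoring does not.
FRESHNESS_THRESHOLDS = {
    "diagnostic_analysis": timedelta(days=3),
    "operational_monitoring": timedelta(hours=1),
}

def is_current_enough(last_refreshed: datetime, task: str, now: datetime) -> bool:
    """Data that is 'current' for one task may be outdated for another."""
    return now - last_refreshed <= FRESHNESS_THRESHOLDS[task]

now = datetime(2015, 11, 1, 9, 0)
refreshed = datetime(2015, 10, 30, 9, 0)  # sales aggregates two days old

# Two-day-old aggregates are fine for diagnosing lower sales...
print(is_current_enough(refreshed, "diagnostic_analysis", now))    # True
# ...but stale for near-real-time monitoring.
print(is_current_enough(refreshed, "operational_monitoring", now))  # False
```

The CIO's negotiation with business, in this framing, is simply agreeing on the threshold per task rather than demanding zero latency everywhere.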

System Quality: System quality, on the other hand, reflects the information processing capability to produce the desired information. Thus, the dimensions of system quality represent user perceptions of interaction with the system over time. Prior research on system quality suggests five dimensions: accessibility, reliability, flexibility, response time, and integration. Accessibility and reliability are, to a large extent, system dimensions; they represent defined properties that are largely independent of usage. Response time, flexibility, and integration are characteristics that are perhaps best evaluated in the context of specific tasks. Designing the right, easy-to-use interfaces for business users, dimensioning system parameters for improved response time, and ensuring at least four-nines uptime can have a significant impact on the user's perception. It is also important to note that any improvement in system quality comes at a cost, and hence the business should be ready to make a trade-off between cost and these quality parameters. Indeed, with the introduction of new data management technologies and in-memory processing systems, the integration and response time issues are largely addressed.
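"Four nines" is a concrete, calculable target rather than a slogan. A quick illustration of the arithmetic shows how little downtime each additional nine leaves per year:

```python
def allowed_downtime_minutes_per_year(availability: float) -> float:
    """Minutes of downtime per year permitted at a given availability level."""
    minutes_per_year = 365 * 24 * 60  # 525,600 minutes in a (non-leap) year
    return minutes_per_year * (1 - availability)

for label, availability in [("two nines", 0.99),
                            ("three nines", 0.999),
                            ("four nines", 0.9999)]:
    print(f"{label}: {allowed_downtime_minutes_per_year(availability):.1f} minutes/year")
# four nines leaves roughly 52.6 minutes of downtime per year
```

Each extra nine cuts the budget tenfold, which is exactly why the business must weigh the cost of that last nine against the perception gain.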

In essence, the success of BI adaptation (the post-implementation phase) is largely about managing the business's perception of the quality of the system and its information. The CIO, in this context, has to play the role of a marketing professional, creating positive sentiment about the BI system that can help move it from the adaptation phase to the routinisation and diffusion phases, where the real potential of the system can be exploited.


Dell, EMC and the Unifying Theory of the Universe

Editor's Note: Today's blog may be called a departure from our core policy, because it is written not by a CIO but directly by a vendor, and reads like a vendor story. But I think it is vital for us to highlight such views, as they affect a large part of the CIO ecosystem. The following text is copied from Michael Dell's LinkedIn Pulse post, which he wrote last month after announcing the Dell-EMC merger, one of the IT industry's most ambitious, expensive and futuristic deals we've ever seen. While this is no endorsement of Dell's takeover of EMC, the vision of Michael Dell and his thoughts on why this deal will bring great positive change deserve due attention. Very few people are capable of dreaming in such a way, and even fewer are capable of achieving it.


For those of you who haven't noticed, it's been a busy couple of weeks at Dell. On October 12th, we announced a definitive agreement to acquire EMC, including a controlling interest in VMware. It's incredible for me. I started this company 32 years ago, building PCs in my dorm room, and as I write this post, Dell is set to become an enterprise solutions powerhouse.

One week later we hosted about 5,000 customers and partners in Austin for our 5th annual Dell World. The timing couldn't have been better, because it gave us a chance to showcase our thinking about the future to our most important audience. I opened with a keynote humbly entitled the Unifying Theory of the Universe 1.0. In it, I began to paint a picture of the future, and the astonishing role that our combined company can play in helping customers achieve their dreams.

Dell and EMC are a highly complementary combination. We bring together the world's greatest franchises in the technology of today; leaders in Servers, Storage, Virtualisation and PCs. And we bring together the industry's leading technology of tomorrow; digital transformation, software-defined data center, converged infrastructure, hybrid cloud, mobile and security. Combined, we become a company with a leadership position in 22 Gartner Magic Quadrants. No one is more relevant or able to add more value.

We can deliver this unprecedented portfolio to the broadest possible global customer base. EMC has an unmatched reputation with large enterprises. At Dell we have unmatched strength in the mid-market and with small businesses. Together, we can fuel innovation across our combined portfolio and across customer segments.

Finally, we are bringing together an unbelievable set of capabilities. EMC is the best company in the industry at incubating new technologies, and that front-end innovation engine will be backed by Dell's best-in-class global supply chain. Think about the scale -- an $80+ billion revenue business. We will be the world's largest enterprise systems company, across suppliers, partners, R&D and innovation. To put it simply:  the best innovation with the highest quality and the greatest value for customers. All of it under a private structure, aligned from our customer-focused innovation to leadership and ownership.

According to a recent Fortune Magazine survey, 84% of CEOs believe it would be easier to manage a private company. Or as I put it at Dell world, "EMC - $67 billion. Being master of your own destiny - Priceless."

We also took the opportunity to reinforce our commitment to open partner ecosystems and our enduring relationships, as when Microsoft CEO Satya Nadella joined me on stage for a joint announcement on Hybrid Cloud and a candid discussion about the state of the industry.

I left Dell World, as I always do, totally energised. Our customers are changing the world through the power of their creativity and innovation. Together, we are going to build the world's infrastructure for the next 20, 30 years. Can you imagine what our customers are going to accomplish? Curing cancer, feeding and watering the world, creating jobs, hope and opportunity on a global scale. With Dell and EMC, we will have the vision, the innovation, the technology and the horsepower to dream that big. I can't wait.



Along the lines of my last post that discussed avoiding slogans and "lazy thinking" in IT, let's talk about the increasingly popular word "heuristic". I think we can all agree that developing software is anything but simple. So why aren't we more skeptical when people propose adopting simplistic heuristics for developing software? Let's look more closely at this manner of thinking, with a specific example.

In a recent exchange, a #NoEstimates advocate declared that one example of someone making a decision amidst uncertainty, without estimating, was the act of catching a fly ball. My response was that there are in fact many estimates involved in that activity, whereupon the #NoEstimates advocate put forth essentially the notion that a fielder uses the following heuristic instead:

"One good way to catch a fly ball is to hold up your gloved hand at a constant angle, and run so that the falling fly ball is directly aligned with your glove. If the fly ball appears above your glove, it is going to go over your head: move back. If it appears below your glove, it is going to fall in front of you: move forward. Left of glove: move left. Right: move right."
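Taken literally, the quoted rule reduces to a bare feedback loop. The sketch below (an illustration of the quote above, not a claim about how real fielders behave; the angle values and tolerance are made up, and only the back/forward axis is shown since left/right is symmetric) makes plain how little the heuristic takes as input:

```python
def glove_heuristic_step(ball_apparent_angle: float, glove_angle: float,
                         tolerance: float = 0.5) -> str:
    """One step of the quoted rule: compare the ball's apparent position
    against the fixed glove angle and move to cancel the difference.
    Everything the heuristic ignores (spin, wind, other fielders, the
    wall, the upcoming throw) simply has no input here."""
    error = ball_apparent_angle - glove_angle
    if error > tolerance:
        return "move back"      # ball appears above the glove
    if error < -tolerance:
        return "move forward"   # ball appears below the glove
    return "hold position"

print(glove_heuristic_step(ball_apparent_angle=40.0, glove_angle=35.0))  # move back
print(glove_heuristic_step(ball_apparent_angle=30.0, glove_angle=35.0))  # move forward
```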

But really: watch any major league baseball game and look for a single instance, just one, where the outfielder is doing anything even vaguely resembling the above. (However, one can certainly conjure up the spectre of hapless Little Leaguers, coached in this #NoEstimates heuristic-driven technique, desperately waving their gloves back and forth in front of their faces, flinching while fly balls thud to the ground all around them). You won't find a real-world example of the above-described heuristic, because using just that heuristic will not lead you to predictable success in catching most fly balls.

In fact, you don't need to be a major leaguer to understand that there is a whole host of empirical observations and, yes, estimates, engaged in immediately and in an ongoing fashion by our model outfielder when confronted with "the crack of the bat". These activities are based on accumulated knowledge and experience, as well as evaluating current parameters. Here are just a few of these assessments/estimates, none of which can be dispensed with via the substitution of a heuristic:

  • Where am I right now: did I already "play in" towards the infield, based on my earlier estimate of where this batter typically hits the ball?
  • Is the ball headed my way? Do I start running immediately? Which direction?
  • What's the ball's speed, direction, spin, trajectory? There's no "one size fits all" here: left fielders know, for example, that balls hit to them by left-handed batters tend to behave differently from balls hit by righties.
  • Should I adjust for wind, thinking about a similar situation in the first inning, where a fly ball like this plummeted suddenly because of high winds?
  • Do I need to call off our hotshot center fielder who might run into me while I'm trying to catch the ball? Or does the center fielder maybe have a better chance of catching this ball than I do, and I should back off?
  • If I run as hard as I can to make the catch, will I run into the wall? (in essence, a cost/benefit evaluation)
  • My knee is still throbbing from yesterday's slide into second. Can I viably run at full speed?
  • Can I dive for the ball, or do I need to position myself for an immediate throw after I make the catch, to try to nail a runner who's off the bag or who will attempt to advance after the catch by "tagging up"?

Fly ball situations are complex, in other words. They're simply not solvable by a sole magic heuristic, and neither is software development. In complex situations requiring judgment and uncertainty, there aren't easy shortcuts or silver bullets. Judgment calls require, well, judgment. There aren't many ways (if any) to avoid putting yourself on the line. But there are most certainly ways of deluding yourself that you can escape accountability for judgment calls by using some kind of pseudo-automatic "paint by numbers" approach. It's once again the seductive lure of the oh-so-easy answer: the no-muss, no-fuss heuristic.

But a heuristic, at least in most cases, just isn't sufficient as a replacement for judgment amid complexity: you don't get something for nothing. Insistence on the efficacy of a heuristic can often serve as an example of what H.L. Mencken famously said: "there is always a well-known solution to every human problem -- neat, plausible, and wrong". 

#NoEstimates advocates, who I've observed quite often gravitate to selective redefinitions of basic words and concepts to support their arguments, might insist that the fielder trying to catch the ball is "assessing" various factors, not really "estimating". But that's mincing words: the fielder is most definitely amassing (even if implicitly, instinctively) a series of judgments and weighing possible reactions, based on both new data and historical knowledge of the situation, judgments that will figure into the many decisions that she has to begin executing immediately. She is constantly tuning and modifying those judgments as the play transpires: she may suddenly decide to dive for the ball, or, recognising that making the catch is unlikely, to pull back so the ball can drop in front of her and she can then be sure to hold the runner to a single. She is, in fact, estimating, in an uncertain, complex situation: juggling cost, benefit, impact, risk. Because, after all, that recurring pattern of assessment, decision, action, and adjustment is really all that estimating actually is.

Think about when someone says, "the outfielder misjudged the fly ball": no, it isn't the case that the outfielder somehow failed to apply a heuristic. Rather, the outfielder failed to estimate appropriately at the start, and/or failed to adjust her original estimates well enough to make the catch. And, do note that there's little chance of that outfielder making the catch at all if she decides that it's useless in general to engage in all those complicated acts of estimating I've enumerated, based on someone having told her that the heuristic is all she needs.

It should be clear by now: other than in the brevity of the time span involved, the process of catching a fly ball has numerous parallels to building software to achieve a functional goal:

  • In software development, you're a professional who's done this many hundreds or thousands of times before; faced with a new request, you make dozens of hourly judgment calls and trade-offs, as you explore, design, and start to build what's needed.
  • You decide, based on new data and historical knowledge and experience, what parts of the job are easy, what parts are risky, what parts will require extraordinary effort, etc.
  • As new information becomes available during the process, you adapt and change your approach as necessary, but your aim is still to catch the darned ball.

There may of course be situations in software development where heuristics can be of use. But when people insist that a heuristic will suffice to solve a tough problem, be at least skeptical. Consider if they're actually presenting a form of silver bullet as the answer for a complex situation.

In short, don't wave your glove at a fly ball just because someone has told you that it'd be low effort, low risk, or maybe even the One True Way to succeed.


This blog is reposted with permission from Peter Kretzman. To read Peter's Blog, you can visit:


India Catching Up Fast with Global IT: Gartner



Gartner's recent release claims that the technology lag that once existed between India and other key geographies is closing faster than ever. The "Gartner Hype Cycle for ICT in India, 2015" shows that more native vendors are entering emerging technology segments in India, covering areas such as crowdsourcing, disaster recovery as a service and the Internet of Things (IoT).


Many of the technologies on the 2015 Hype Cycle for ICT in India also appear in the global ICT Hype Cycle. When Gartner compared the technology entries for India with the rest of the world, the analyst firm noticed that the overall technology lag - that is, global traction versus India traction - is gradually closing.


Gartner claims that India has evolved from an IT environment that was about 18 months to two years behind global trends at the start of the decade, to one where most trends in India are in sync with global trends.


The Hype Cycle for ICT in India, 2015 identifies 26 key technologies that are most relevant for information technology in India, assesses their maturity and traction, and positions them on the Gartner Hype Cycle. Technology vendors, service providers, stakeholders in IT, and business and government organisations can use this information to determine the maturity of each technology, as well as when each technology is expected to reach mainstream adoption.


On the Hype Cycle, IoT is top of mind for many Indian enterprises. Gartner believes that the IoT can benefit Indian enterprises in multiple ways, but for successful IoT implementation, Indian organisations will first have to understand the business use case for which they want to use IoT. Success will also depend on carefully aligning IT and operational technology (OT) resources, processes and people. Therefore, experimenting with pilot projects to understand the implications for people, process, technology and the business is an essential first step for Indian organisations.

Behind the IoT on the Hype Cycle, at the Innovation Trigger, is bimodal IT. It is a paradigm shift in how IT approaches its role within the enterprise. With technology becoming a more critical element of supporting business growth in a digital business era, it is imperative that IT adopt an approach that allows it to support the needs of the business at the pace of business change, which Gartner identifies as Mode 2. However, IT also needs to continue providing the traditional and foundational requirements it has always supported, which Gartner lists as Mode 1. Supporting both of these vastly divergent activities is the fundamental premise of bimodal IT, and, done correctly, it can be a strong competitive differentiator for the enterprise.

DynamicCIO View: While Gartner's Hype Cycle and its predictions are quite close to reality, the fact is that IoT still isn't implemented in many organisations, not just due to the lack of a business case but also because of issues like network inconsistency, technology standardisation and support.

DevOps is still confined to a few mature IT set-ups rather than being mainstream.

While most Indian enterprises are now making strides of different scales in Digital Business, a key consideration for them is still how to reorganise IT optimally.

Having said that, there is no doubt that Indian enterprises are quite capable of competing on a global stage now. There is huge technology uptake in the new areas, and it is true that many native vendors are now coming to the fore to participate in it.




Quick Fixes through Tech-driven Business Disruptions



While surfing through NDTV's Car & Bike section yesterday, I came across an interesting headline: Renault Kwid Receives Over 25,000 Bookings. The car, which will compete in the entry-level segment of this price-sensitive market, was launched only a couple of weeks back. That intrigued me to learn a bit more about this whopping number of bookings in the opening two weeks. The response gives a huge boost to the French auto major, which is going through a rough patch in India.


There could be many reasons for the fantastic opening of the KWID.


The first reason, which I assume is true, could be the price tag: Rs. 2.56 lakh (ex-showroom) is a phenomenally low price for a nicely done 54BHP, 800CC hatchback.


The second reason, yet another clear clincher, is the 50,000 km/2-year maintenance policy with complimentary on-road assistance.


The third reason, which to me is the showstopper, is a technology-led disruption: the Renault KWID Mobile App and Virtual Showroom. Just ten days before the launch of the car, Renault unveiled this app on Google Play (see below).


[Screenshot: Renault KWID app listing on Google Play]

Who could have imagined that a car maker like Renault would introduce a killer innovation that can not only change the perception of new-age, digital buyers, but also create an extreme disruption that rattles the market leaders? Without naming people here, I did hear the CMO of one of India's top car companies say they have already placed a requisition with their IT organization to create a mobile app for their top-selling car(s).


There are six kinds of innovation practices popularly known to us: Knowledge Channel Innovation, Peer-to-Peer Innovation, Outside-in Innovation, Technology-driven Innovation, Bottom-up Innovation and Top-down Innovation.


Among these, technology-driven innovation has been the most visible and popular, especially for companies that are not part of the high-tech sector.


For companies that do not belong to or compete in the high-tech sectors, the development and deployment of technology has become a critical success factor. While automotive companies have traditionally used technology in sales force automation, building supply chains, connecting dealers, the entire assembly process and so on, a new era has been ushered in where they need to think of disruptions in connecting with potential buyers, because technology is the only thing that can help them reach customers faster in today's markets.


Renault did exactly that. While I don't have comprehensive details of how this app works, what technology integrates the OEM with dealers, or how the bookings are processed, I do know that this disruption has the potential to go viral sooner rather than later.


Using the app, you can book a Renault KWID in three easy steps:


1. Download and open the app; on the menu bar you will see a "Book Now" button.


2. Clicking the button takes you to the booking page, where you have to provide the following details:




Variant (drop-down list)





3. As soon as you fill in the above details, the app sends you a one-time password (OTP). On entering the OTP, you are taken to the payment gateway; make a payment of Rs. 5,000 and you are done!
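The three steps amount to a simple state machine: collect details, verify an OTP, then take a token payment. Here is a hypothetical sketch of that flow (the app's actual internals are not public; the class, method names and variant code are invented for illustration, and only the Rs. 5,000 booking amount comes from the post):

```python
import random

class BookingFlow:
    """Hypothetical model of the app's three-step booking flow."""
    BOOKING_AMOUNT_INR = 5000  # token booking payment, per the post

    def __init__(self):
        self.details = None
        self.otp = None
        self.confirmed = False

    def submit_details(self, variant: str, phone: str) -> None:
        # Step 2: buyer picks a variant and provides contact details;
        # a one-time password is sent to the phone for verification.
        self.details = {"variant": variant, "phone": phone}
        self.otp = f"{random.randint(0, 999999):06d}"

    def verify_otp_and_pay(self, otp: str) -> bool:
        # Step 3: a correct OTP unlocks the payment gateway; the token
        # payment of Rs. 5,000 completes the booking.
        if self.details is not None and otp == self.otp:
            self.confirmed = True
        return self.confirmed

flow = BookingFlow()
flow.submit_details(variant="RXT", phone="98xxxxxx01")
print(flow.verify_otp_and_pay(flow.otp))  # True when the OTP matches
```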


Who thought car booking could be so easy?


Even before booking, if you want to take a virtual tour of the car, you can use the Renault KWID Virtual Showroom to get a 360-degree view of the car, its specs and everything else you want to know before buying a small car. (see images below)





The reason I have blown up a small innovation is simply that I found it quite disruptive. The trauma of booking a car by visiting the showroom is replaced by a delightful, awesome experience. Sure, it may have some inherent bugs and issues, but that's not a big deal!


Another observation: Innovation isn't big or small. It is simply about timeliness and how it impacts the business.

Behind the Clicks: Three E-Commerce CTOs Share Their Challenges



If there is one segment in India that has caught the fancy of investors, entrepreneurs and customers in a phantasmagorical display of consumerist abundance unlike any other in recent times, it is e-commerce. On the one hand, the sector has its naysayers (the latest being former Network18 chief Raghav Bahl, who has said the bubble will burst soon) and, on the other side, there are the poster boys who continue to inspire youngsters to set up online shops (a la Sachin and Binny Bansal of Flipkart, Vijay Shekhar Sharma of Paytm, Kunal Bahl and Rohit Bansal of Snapdeal and many, many others).


The statistics are getting quite interesting and really, really serious. An Assocham-PwC study projects that the Indian e-commerce industry will grow from $17 billion at present to $100 billion in five years' time--at an impressive CAGR of 35%. The bubble part, and I do believe there is a bubble part though the whole sector cannot possibly be a bubble, will likely put some pinpricks and bring that estimate down to a more sober level. Even then, it will be a multibillion dollar sector in respectable double digits.


Interesting as the stats and the daily dose of funding infusions are, this post is not about them. It is about a behind-the-scenes look at how the technology heads at some of the niche e-commerce firms in India are keeping the clicks on. As CIOs/CTOs and enablers of the business, what challenges do they face in different aspects of running an e-com business? Are they under pressure and if so, to what extent? What tech tools and strategies are they using to keep up in this fiercely competitive and hot domain?


While the spotlight may be turned for the moment on the Flipkarts, Snapdeals and Amazons of the world, I believe it will be several mid-size companies, specialist e-com players (like those featured in this post) and other ecosystem partners (the suppliers, logistics firms, affiliates, etc.) who will collectively make the e-commerce industry come into its full diversity and bring it into a sustainable trajectory for the years to come.


So, let's learn what the IT honchos of Purplle, Craftisan and Clovia have to tell us.


"Being a startup, we wanted to start small and then scale up, so we chose to host our infrastructure on the cloud," says Suyash Katyayani, CTO of Purplle (which tags itself as "India's beauty destination"), when I asked him why they chose the cloud (Amazon Web Services) for most of their IT needs. "Thus far, AWS has proved to be a good choice for us, as it offers a lot of flexibility and we can quickly spin up and shut down test servers whenever we want. The billing is pay-per-use, which suits us. While at this stage of our growth we can probably afford to spend on the infrastructure, there are costs beyond the bare infra costs that we would have to bear, including managing the infrastructure and hiring skilled resources."


And one of the major challenges Suyash faces is in hiring good technical resources, especially in Mumbai, where the company is based.


While he has tried to automate most of the processes at Purplle, some challenges remain. Says he, "I have realized that there are a few problems in automated processes; certain things such as customer returns and logistics are very difficult to fully automate."


For that purpose, the company has developed its own in-house software. But could they have done without it?


"If we were to use a packaged application for that, we would require a lot of people to perform all the associated processes. So in the current setup, we have assigned 2-3 tasks to one person and are thus optimizing our resources. Here again, handling exceptions in returns, something which is not written in black-and-white or which gives birth to unforeseen situations--now, that often proves to be a challenge," he explains.


Nevertheless, as the company goes through more situations and handles more orders as well as returns, its ability to handle the exceptions is also improving.


Another challenge is presented by the need to perform constant quality checks and to ensure that the product that is shipped by vendors is of the same quality and quantity as per the invoice. In fact, Purplle has chosen to deploy its own home-grown ERP.  This, says Suyash, is not the norm in e-commerce, but it is working "just fine" for the company.


For a niche e-commerce player like Purplle, it is very important to engage its customers more deeply than the typical horizontal e-tailer that competes largely on price alone. How does the company do that?


"While we also offer the best price to our customers, we actively encourage them to share their experience and review our products or services they have used," says Suyash. When I press him some more, he puts it succinctly: "We are not a large e-commerce player like Flipkart; we are not a technology company either; we are a beauty company that knows how to use technology to sell and serve our customers."


Similar sentiments are echoed by Dheeraj Sharma, Head - Offline & Online Operations, Logistics and User Interface, including Technology, at Craftisan. The site is, as the name suggests, an online marketplace for Indian crafts and craftsmanship.


"I think most e-tailers are focused on fashion and electronics but these two do not encompass the entire e-commerce space. So there will be a lot of scope for niche players," he says.


Like Purplle, Craftisan also uses a mix of in-house and outsourced technology (the site is hosted in Singapore by DigitalOcean, which also doubles as a content delivery network). The company uses the Magento e-commerce platform, and the required customizations are handled in-house.


"The mix of in-house and outsourced IT infrastructure is working out very well for us, because we can keep within the cost constraints. We can also do any customizations remotely on our site," says Dheeraj.


One of the big challenges he currently faces is in managing a lot of offline interactions with its 700-800 vendors. "We are evaluating CRM solutions such as Unicommerce or Vinculum, which will allow us to scale up and take those interactions online. Using a solution like that, we would just give the vendors user names and passwords and they will be able to upload their product information directly on our platform," he says.


Another challenge: The company uses services from different logistics companies such as FedEx, Blue Dart and DHL in different locations globally, but it has to use multiple systems to manage deliveries. "We want an integrated solution that can give us ease of use and visibility across different logistics vendors," he says. Unfortunately, there don't seem to be many such solutions out there, and he's still contending with the problem.


To build customer loyalty and improve site stickiness, Craftisan uses many re-marketing tools and services from US-based companies that help it send targeted emails to customers who abandoned their shopping carts midway and did not finish the transaction. "Through these tools, we can give them discounts or other attractive offers. We also incentivize customers for completing our surveys or filling up the feedback forms," says Dheeraj.


The tactics seem to be working: 30 to 35% of people who shop on Craftisan are returning customers, he says.


For Aditya Chaturvedi, Co-Founder & CTO of Clovia, which specializes in lingerie, one of the key challenges is to strike a balance between operations and sales on a day-to-day basis. In a traditional environment, one wouldn't think much of it, but when you are growing 20-25% month-on-month, it does alter the situation.


"Given that the operations are very complex and I get a lot of requests from multiple departments, one needs to plan and execute very well," says Aditya. "We sometimes make processes that make sense in the conference room but which turn out to be ineffective or not feasible in the real world."


The good thing, he says, is that he gets constant feedback quickly and everyone in the company is forward-thinking and confident that "technology can solve their problems."


Again, for Clovia, the IT setup is a blend of in-house and outsourced solutions--with mobile apps being the most frequently outsourced.


For its hosting needs, the company recently shifted from Netmagic (India) to AWS (Singapore); with the latter, he believes, they can upscale and downscale more easily and there's more overall flexibility.


As its e-com platform, the company decided to adopt an open source shopping cart and build on it for custom use. It uses Satchmo, an e-commerce framework built on the free and open source Django web application framework.


When asked how technology can help the company build differentiation, he says, "We keep asking questions of ourselves and our customers as to how we can serve their needs better. For instance, bra sizing is a complex problem in our industry segment and we use online tools to offer different fits and options to the customers. Another innovation is the 'office mode cum privacy mode' for viewing the site. This keeps most of the images on the site grey until the user hovers over a particular picture, which then turns full color."


Such technical features, he says, are very much appreciated by many users who might be shopping for lingerie in an office or public environment and do not wish their screens to be splashed with stuff that could otherwise be embarrassing for them.


The company uses a mix of analytics tools to keep track of customer activity on its site, including those developed in-house, Google Analytics and a popular web and mobile analytics tool called Mixpanel. These help them study the pattern and cycles of customer purchases and do targeted marketing campaigns based on those patterns. This lends a contextual flavor to the emailers and other customer communications, including SMS.
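The purchase-cycle analysis described above can be approximated simply: compute each customer's average gap between orders and flag those who have gone quiet for longer than usual. The 1.5x slack factor and the data shape are illustrative assumptions, not details of the company's actual tooling:

```python
from datetime import date

def avg_purchase_gap_days(purchase_dates):
    """Average days between consecutive purchases for one customer."""
    ds = sorted(purchase_dates)
    gaps = [(b - a).days for a, b in zip(ds, ds[1:])]
    return sum(gaps) / len(gaps) if gaps else None

def due_for_nudge(purchase_dates, today, slack=1.5):
    """Flag a customer whose silence exceeds 1.5x their usual cycle."""
    gap = avg_purchase_gap_days(purchase_dates)
    if gap is None:
        return False
    return (today - max(purchase_dates)).days > gap * slack

# A customer who buys roughly monthly:
history = [date(2016, 1, 1), date(2016, 2, 1), date(2016, 3, 1)]
print(due_for_nudge(history, today=date(2016, 5, 1)))   # two months silent
print(due_for_nudge(history, today=date(2016, 3, 15)))  # within normal cycle
```

Flagged customers would then receive the contextual emailers and SMS messages the article mentions.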


So, how do these CTOs feel about working in the fast-paced, challenging environment of e-commerce, the expectations of other CXOs and the future in this industry?


To begin with, they all feel the expectations in e-com are above what one would normally encounter in other industries (though in varying degrees). Yet, they are almost unanimous in their excitement and positivity toward the sector.


"In e-commerce, business requirements are not always clearly defined. The business model or logic can change quickly, depending on how the top management has analyzed a given situation or certain data. For instance, one can choose to change the business model from sourcing a product from multiple vendors to just one or two vendors. This can be quite challenging to implement in a limited timeframe, considering that we source ten to fifteen thousand products every day," says Suyash of Purplle.


Nevertheless, he says that working in e-commerce is indeed very exciting and "we need to be very agile to do our job."


The next big milestone for Purplle, whose shipments will increase from around 4,000 a day in January this year to as many as 10,000 a day in another 8-10 months, is to reduce the average shipping time from one-and-a-half to two days to same-day delivery.


Clovia's Aditya, too, is quite confident both of the sub-segment his company is in and the overall e-com space. Says he, "We are not in the GMV game (GMV = gross merchandise value), unlike other large players in e-commerce. Besides, we see ourselves more as a brand than a retailer. We can turn profitable any time we want if we control our growth, which is very high right now."


The hyper-growth scenario and feverish competition in this sector, he says, is because a lot of investment has gone into e-commerce in the past 3 to 5 years--probably the equivalent of what went into the physical retail space in a period of 10 or perhaps more years.


For his part, Aditya believes that there's still a lot of white space left to cover even within a niche segment such as lingerie. "It is very exciting to work in a startup environment," he says. (In fact, he has been more or less a "startup guy" having played leading roles in the tech function at three companies when they were just starting off: GlobalLogic, 3CLogic and hCentive.)


In the e-com space, you are often required to play multiple roles, feels Dheeraj of Craftisan. "In a startup environment, things move much faster and you often have to wear different hats within the organization. For example, I also play the role of HR in helping the company align people's expectations with business needs and goals. Some people may be hired for marketing but may need to be moved to a different role, say, social media," he says.


With Craftisan for the past three years, Dheeraj has earlier worked with the Smile Group and another venture (which shipped products only outside India). He also had stints with traditional companies like British Telecom and HCL.


When asked to compare working in an established company versus in e-commerce, he says both have their pros and cons. "But the kind of explosive growth that e-commerce is going through in India at the moment, you cannot get anywhere else. The field will continue to grow at least for another 5-6 years, as there's still a lot of action yet to happen," he opines.


As the action rolls on, we at DynamicCIO will bring you more stories from behind the scenes in e-commerce. Watch this space.


Data, data everywhere--not just the right kind to make effective decisions! It wouldn't be wrong to assume that this is the common lament in most enterprise decision-making circles today.


On the one hand, companies are drowning in an unprecedented flood of data, structured as well as unstructured. And, on the other, CIOs, CMOs and other CXOs are struggling to get a handle on all that data, put it into the right perspective, extract and massage it into a usable form and take quick, effective decisions. The ones that can earn their firm the much-prized moniker of an agile business or a data-driven enterprise.


While making decisions in any enterprise involves a whole battalion of executives, LOB heads, managers, supervisors and many others, I think the job of enabling the whole organization to take decisions based on analytics rather than hunches (and perhaps, lunches) is most suited to the CIO. The reason is simple: who else has an across-the-board view of the data ecosystem of the company? And that too with the additional knowhow of how the information systems work (or can be made to work)?


So, without further ado, here are five ways CIOs can enable an environment for adaptive, data-led decision making in their organizations:


Making speed count: I know one thing for sure: organizations of all stripes today collect all sorts of data. Through all sorts of forms. Through innumerable calls to customers and prospects. And through sources such as the usual enterprise data captured via ERP and other operational systems. But how fast are you with the data you collect? Does it lie buried in file cabinets or on dusty disks? Simply putting the data to quick use can make a huge difference to the organization. Following up on a hot lead soon after it is captured, for instance, will translate into revenue; too much delay, on the other hand, will make the prospect turn to your competitors.


Knowing your data from your metrics: This may sound simple to some and unnecessarily complicated to others. Yet this article on the Harvard Business Review site illustrates the difference, and its significance, quite clearly. Authors Jeff Bladt and Bob Filbin cite the example of a YouTube video, asking the reader to guess how many views would qualify the video as a success. Now, the particular video in question had garnered 1.5 million views, but it failed to do what it was supposed to do: encourage young people to donate their used sports goods. Despite the impressive views, only eight viewers signed up for donation--with zero finally making the donation!


Not all results (or metrics) will be at such extremes. But the point is well made: in any data collection or numbers-related exercise, you need to specify clear metrics that will reliably give the true measure of success for the initiative.
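To make the authors' numbers concrete, a quick calculation shows just how far the vanity metric (views) diverges from the success metric (conversions):

```python
views = 1_500_000
signups = 8
donations = 0

view_to_signup = signups / views
signup_to_donation = donations / signups if signups else 0.0

# 1.5M views looks impressive, but the metric that matters is conversion:
print(f"view->signup rate: {view_to_signup:.6%}")
print(f"signup->donation rate: {signup_to_donation:.0%}")
```

Defining "signup rate" or "donation rate" as the success metric up front would have told a very different story from the raw view count.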


Data is data is data, right? Wrong: When data is to be put at the heart of decision-making in an enterprise, it matters all the more that the data be accurate, consistent and timely. One may be under the impression that all the data required for a project, say, a marketing campaign, is available; but if the data quality is not up to the mark, the results of the campaign will certainly fall below expectations.
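As a minimal sketch of what "accurate, consistent and timely" can mean in practice, the checks below validate hypothetical CRM records on those three dimensions; the field names and the 30-day staleness threshold are assumptions for illustration only:

```python
from datetime import date

def quality_report(records, today, max_age_days=30):
    """Flag hypothetical CRM records that fail basic accuracy,
    completeness and timeliness checks."""
    issues = {}
    for r in records:
        problems = []
        if not r.get("email") or "@" not in r["email"]:
            problems.append("invalid email")
        if not r.get("name"):
            problems.append("missing name")
        if (today - r["updated"]).days > max_age_days:
            problems.append("stale record")
        if problems:
            issues[r["id"]] = problems
    return issues

records = [
    {"id": 1, "name": "Asha", "email": "asha@example.com",
     "updated": date(2016, 3, 1)},
    {"id": 2, "name": "", "email": "not-an-email",
     "updated": date(2015, 6, 1)},
]
report = quality_report(records, today=date(2016, 3, 10))
# record 2 fails all three checks; record 1 is clean
```

Running checks like these before a campaign launch is one simple way to avoid the wasted marketing spend the Experian study describes.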


According to a data quality study by Experian Information Solutions, 32% of U.S. organizations believe their data to be inaccurate and further, 91% of respondents believe that revenue is affected by inaccurate data in terms of wasted resources, lost productivity, or wasted marketing and communications spend. If that's the case with such a data-rich economy, one could imagine how bad the shape of things would be in a country like India, where data collection and research are relatively new fields and far from being mature scientific disciplines. In this context, the need for best practices as well as tech tools in maintaining high data quality cannot be over-emphasized.


Democratization of analytics: How many of you remember the era of generating sporadic MIS reports for the consumption of the privileged few? Well, that era is long gone. However, most companies are still chary of sharing key statistics or analytics data beyond the confines of top or senior mid-management. But gradually, this state of affairs, too, is set for a bold change. Some call the coming wave the democratization of data or analytics, in which actionable data percolates to the lowest links in the organizational hierarchy.


Having said that, democratizing data does not mean dropping a huge spreadsheet on everyone's desk and saying, "good luck," as Kris Hammond, Chief Scientist at Narrative Science points out in this article. On the contrary, he explains what it involves simply and emphatically: "Democratization requires that we provide people with an easy way to understand the data. It requires sharing information in a form that everyone can read and understand. It requires timely communication about what is happening in a relevant and personal way. It means giving people the stories that are trapped in the data so they can do something with the information."


Point well made: unless people can take "informative action," the analytics tools or the extracted data will have little value for the people or the organization.


Analyzed this, have ya? Now visualize that, too: I'm not sure if you noticed but the Internet has been flooded with a new tool of information dissemination in the past couple of years. It's called the infographic. For most of your searches on Google, there are now an eye-load of infographics, those illustrative diagrams that give you the needed information with icons, pictures, graphs and anything non-text.


Much less noticeable but equally important, a similar movement is underway within enterprises in the context of data analytics. Vendors such as Tableau Software and Qlik Technologies are leading the charge in this emerging segment, referred to as the visual analytics market.


According to specialist consulting firm Atheon Analytics, visual analytics "brings together computer science, information visualization, cognitive and perceptual sciences, interactive design, graphic design, and social sciences." (To see the power of visualized data in action, watch this slightly old but enormously impactful video, the Joy of Stats, of Swedish statistician Hans Rosling, who is often referred to as the "Jedi master of data visualization.")


The above are only a few of the multiple ways in which CIOs can bring the hidden power of data to the forefront of organizational ability and agility. There are plenty of tools and technologies available but each organization must find its own best-fit path to data-driven success. The key is to start the data journey as early as possible and do so in right earnest.

Will Unified Data Protection Find Traction with Indian CIOs?



Enterprise backup has been a pain in the neck for CIOs and IT operations managers for a long time. As storage environments grew more complex--in where data is stored (tapes, disks, etc.) and how it is accessed, retrieved and managed--so did the headaches associated with backing up the data and being confident of restoring it quickly in case of a calamity.


In the early days, enterprises met most of their archival needs through tape-based backups taken periodically, say, once a day, week or even month. However, as backup volumes grew and as organizations' dependence on current data for continuous business operations became more critical, restoring data to a meaningful recovery point objective (RPO) proved extremely hard.
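The arithmetic behind that difficulty is simple: with periodic backups, the worst-case data loss equals one full backup interval, so a daily tape cycle can never satisfy a tight RPO. A small sketch with illustrative numbers:

```python
def meets_rpo(backup_interval_hours, rpo_hours):
    """With periodic backups, the worst-case data loss is one full
    interval: a failure just before the next backup loses everything
    written since the last one."""
    worst_case_loss = backup_interval_hours
    return worst_case_loss <= rpo_hours

# Daily tape backup against a 4-hour RPO target:
print(meets_rpo(24, 4))    # up to 24h of data can be lost
# Near-continuous (15-minute) disk snapshots against the same target:
print(meets_rpo(0.25, 4))
```

This is why the shift from daily tapes toward disk-based, near-continuous techniques, discussed next, was driven by RPO rather than raw capacity.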


Then came disk-based backup technologies such as Continuous Data Protection (CDP), which provided more confidence in the event of a disaster, especially with data integrity verification; but these were considered resource-hungry (causing application performance issues) and, usually, quite expensive.


In this context, the concept of Unified Data Protection (UDP) has been catching on among enterprises and there are dozens of vendors out there in the market who offer these solutions.


According to Ganesh Kuppuswamy, Director for India & SAARC at Arcserve, a provider of DR and backup software solutions (including UDP), managing the enterprise storage infrastructure is "a huge challenge" for most CIOs due to the growing complexity in the product mix of servers, storage boxes and operating systems in their environment.


The use of a UDP, he says, not only reduces that complexity and makes manageability simple, it also saves on the need to hire multiple admins for different backup and data protection systems that otherwise require different skill sets.


UDP works, as the term suggests, by unifying backup across tapes and disks, in addition to data replication for the purpose of disaster recovery. "UDP provides upward scalability and removes vendor-stickiness," says Kuppuswamy, referring to the proprietary ways in which traditional backup software has worked thus far.


It seems to me that there's quite a tussle among vendors in the UDP market, what with one company apparently trying to trademark the term.


To provide a "proper definition" of UDP and "mitigate the confusion" around it, a post mentions: "We can define "united data protection" as using snapshots plus Changed Block Tracking (CBT) to provide near-continuous backup. That provides for quick restore, which some vendors, in their enthusiasm, call "instantaneous." Experience suggests no outage is going to be instantly rectified; after all, it takes a little time to size up the situation in the case of a disk failure.  Let's just say it is a many-fold (sic) improvement over the old weekly, daily, incremental backup strategy that exposes a business to a long down time in the case of catastrophic failure."
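The snapshot-plus-CBT idea in that definition can be illustrated in miniature: hash fixed-size blocks at each snapshot and ship only the blocks whose hashes changed. The 4-byte block size below is a toy value for illustration; real systems track much larger blocks (e.g. 64 KB):

```python
import hashlib

BLOCK = 4  # toy block size in bytes; real systems use e.g. 64 KB

def block_hashes(data):
    """Hash each fixed-size block so changed blocks can be detected."""
    return [hashlib.sha256(data[i:i + BLOCK]).hexdigest()
            for i in range(0, len(data), BLOCK)]

def changed_blocks(old_hashes, new_data):
    """Return (index, block) pairs for blocks that differ from the
    previous snapshot -- only these need shipping to the backup target."""
    new_hashes = block_hashes(new_data)
    return [(i, new_data[i * BLOCK:(i + 1) * BLOCK])
            for i, h in enumerate(new_hashes)
            if i >= len(old_hashes) or h != old_hashes[i]]

snapshot1 = b"AAAABBBBCCCC"
snapshot2 = b"AAAAXXXXCCCC"  # only the middle block changed
delta = changed_blocks(block_hashes(snapshot1), snapshot2)
# delta contains just block 1 (b"XXXX"), not the whole 12 bytes
```

Shipping only the changed blocks between frequent snapshots is what makes restores far quicker than the old weekly/daily/incremental cycle, even if never literally "instantaneous."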


The excitement and buzz around the UDP segment is understandable. According to an edited report about Gartner's 2015 Magic Quadrant for Enterprise Backup Software and Integrated Appliances, "by 2016, 20% of organizations, up from 12% today, will employ only snapshot and replication techniques, abandoning traditional backup/recovery for the majority of their data; by 2018, 50% of organizations will have augmented with additional products or have replaced their current backup application compared to what they employed in 2014; and by 2019, there will be a 60% increase in the number of large enterprises opting to eliminate tape for operational recovery."


With the tendency of Indian organizations to stretch the use of legacy systems much longer than elsewhere, how do you think will the backup scenario change here? And which vendors are likely to lead the UDP/enterprise backup market in the next few years? Do write in with your crystal-gazing views.


Are These Exciting Times for a Retail CIO?


The retail sector in India has been on a rollercoaster ride for the past few years and will continue on this trajectory for some time. According to the report "Retail 2020: Retrospect, Reinvent, Rewrite" by Boston Consulting Group (BCG) and the Retailers Association of India, overall Indian retail sales are expected to grow to $1 trillion in 2020 from $600 billion in 2015.


There is hardly any surprise in this humongous growth. Post-liberalization, the so-called great Indian middle class has been on a spending spree, buying not only groceries and clothes (the primary stuff) but also a whole lot of lifestyle and luxury items--be it furniture, fashion jewelry, personal accessories, smartphones, ultra-hifi televisions and what have you.


It has been most interesting, of course, to see the story of large format retail (LFR) or organized retail unfold in a country that is considered highly value-conscious and peculiar/complex in its purchasing habits. Denoting this segment as "modern trade," the BCG report projects it to triple--from $60 billion in 2015 to $180 billion by 2020.


However, over the past few years, as the bigwigs of Indian retail (the Rahejas, the Birlas, the Biyanis and the Ambanis) vied with each other to grab prime real estate and competed to build larger and glitzier malls in the suburbs, it became clear that retail is indeed a long-haul game. And one in which the toplines may grow with wider presence but the profits may not follow footfalls. As per an estimate by the credit-rating firm Crisil, the accumulated losses of food retailers in India crossed $2.2 billion in FY 2014-15. In the context of actually making money, the story of other categories in big retail is not much different.


The situation is causing transformational changes in an industry that is barely a decade-and-a-half old in India. One side of the transformation is to do with retail chains becoming more financially and "physically" prudent--by cost optimization and cutting down the store size to small-and-medium rather than large or very large. (Shoppers Stop, for instance, is said to be reducing the size of many of its 80,000 sq. ft. HyperCity stores and plans new ones only in the 30,000 - 50,000 sq. ft. range).


And then there is the technology side of the transformation story, which is what we are most concerned with at DynamicCIO. It is also the side that we find more exciting and that can potentially tip the scales to a more favorable position for organized retail.


Retail CIOs, like their counterparts in other industries, have been putting the basic IT infrastructure in place for some time now--the networking nuts and bolts, enterprise applications, supply chain management, CRM, etc. These have thus far helped them run the operations on a large scale with adequate efficiency and just-about-enough forward planning.


However, the increasing business pressure and rising customer expectations, especially in the face of strong emergence of e-commerce, have compelled most retailers to embrace what is called an omni-channel strategy. By transforming themselves into omni-channel enterprises, retail businesses aim to offer consumers a superior shopping experience irrespective of their mode of buying--be it through physical stores or through online sites. What's more, the omni-channel strategy also takes care of the multiple devices increasingly in use by consumers, including desktops, tablets and smartphones.


Built into such an omni-channel strategy is the notion that buying behavior has undergone a sea change in today's super-connected and hyper-mobile world. Consumers may choose to research a product, talk about it, experience it, eventually buy it and even say a thing or two about the purchase on multiple forums (social media or otherwise). And retailers that are able to capture all those interactions, sift through the clutter and deploy IT tools that make the best use of actionable intelligence will master the art of omni-channel (and rake in some profits as well).


Seamless customer experience, real-time or near real-time analytics, and clinching the sale at just the right moment all figure highly in the omni-channel scheme of things. If customers are no longer loyal to a single medium, why not embrace the whole buying landscape? That's going to be the new success mantra.


In her ThinkWithGoogle article titled "Omni-Channel Shoppers: An Emerging Retail Reality," Julie Krueger, Industry Director of Retail at Google, writes: "As digital continues to touch every step of the customer journey, multi-channel retailers who operate both e-commerce and in-store channels are having to take note. They're changing the way they think about omni-channel shoppers (for instance, Banana Republic customers who shop both online and in-store) and what their shopping behavior means for the overall business. The most sophisticated retailers are ensuring their marketing strategies are geared toward enabling customers to convert on any channel. Why? Because they realize that a shopper who buys from them in-store and online is their most valuable kind of customer. According to a 2015 study by IDC, these shoppers have a 30% higher lifetime value than those who shop using only one channel."


According to Paul Coby, CIO at UK retail major John Lewis, which has won industry awards for its omni-channel strategy, the company followed a Track-Know-Manage approach (track every item, know the customers and manage everything across all the different channels). Among other things, it began putting in place a new order management system that covers all orders irrespective of whether they're made in stores, online or via mobile. It is also replacing its previously siloed supply chain systems with a new ERP system.


Echoing the need, retailers worldwide are busy understanding, embracing, executing or perfecting an omni-channel strategy. The transformational change is being driven by the omni-channel shopper, not much unlike how traditional IT is getting redefined by the consumerization of technology.


A curious thing about the Indian retail market, however, is that while big-box retail ruled in the developed world for many, many decades before feeling threatened by online upstarts, India seems to have leapfrogged most of those years. Here, the giant brick-and-mortars have come face to face with the Flipkarts and Snapdeals and scores of other deep-pocketed e-tailers way too early in their maturity curve. They had barely cracked the code on preventing stock-outs when they were brought up to speed on shopping carts and "if you liked this, you might also like that." Hardly had they learned to use the new analytics tool when they realized that the unstructured data their tool did not capture (was not meant to capture) was also important for understanding customers (and influencers). They are having to learn new tricks before they have mastered the old ones.


So retail is at an interesting crossroads in India, where the old-world charm is melding with the new-world allure. Can you make some predictions about how the bricks-and-mortars and the e-commerce brigade will play it out over the next few years? Do drop us a line or, if you are a retail CIO, share your own journey and experience.

IoT in Healthcare: Are Cost and Complexity Prohibiting Rollouts?


In my last blog, "India's Healthcare Sector: Will Challenges Become Opportunities," I discussed how the sector is becoming an attractive destination for investment and how the investments are also being utilized for creating tech-enabled healthcare.


The momentum is great, and almost all large hospitals have covered the basic technology landscape: Electronic Medical Records (EMR), Picture Archiving and Communication Systems (PACS), ERP and so on. Now the next phase of technology adoption and experimentation is slowly trickling into health enterprises. The key areas where these new technologies are being experimented with and/or adopted are remote patient monitoring, OPD and doctor enablement.


Every CIO in a health enterprise today is looking to create a responsive, system-driven organization--one that offers patients convenience and ease of treatment on the one hand, and keeps doctors updated on the move on the other.


DynamicCIO's recently concluded survey identifies the key technologies that CIOs are focusing on. A large number of CIOs (36%) suggested they are adopting one or more elements of SMAC (Social, Mobile, Analytics & Cloud) in their respective organizations. While the number of organizations that claim to have a mature setup is small (5%), it is still very encouraging to know that they are moving in the direction of becoming truly digital health enterprises.


The data collected suggests that most of these implementations are taking place in Cloud, Mobility and Analytics. The Internet of Things isn't really on the radar of CIOs in its current form. Merely 10% of CIOs said they are willing to experiment or are doing some work with IoT in their organizations.


On this point, I personally called a few CIOs from health enterprises to find out who's doing what, and how, especially with the use of IoT.


Kapil Mehrotra, Head of IT at Artemis Hospital in Gurgaon, is pretty enthused about IoT and its use in patient care. Artemis is currently piloting sensor-based devices for critical cardio and cancer care patients who need constant monitoring. His logic is simple: the use of IoT not only provides critical patients a safety net but also saves them from expensive in-patient procedures. "These devices, unlike in the past, aren't cumbersome. They are SIM-based, so there are no issues of wireless connectivity either. They are connected to a 24/7 monitoring center in the hospital that issues an alert the moment the threshold is breached. This gives the doctor and patient enough time to take proactive steps," says Kapil, who is in the process of rolling this out soon. The end-user devices are procured from Singapore and are quite reliable and inexpensive.


Rajesh Batra, CIO of the Mumbai-based Kokilaben Dhirubhai Ambani Hospital, is also quite bullish on IoT and is ready to test it in the real world. He has discussed and zeroed in on a technology that could prove a boon for patients and doctors alike. "We spoke to BPL in India for handheld devices to be used in home/elderly care. These devices transmit the data to a central server, which can press an emergency switch for an ambulance in case it's mandated," explains Rajesh. But Rajesh also feels that the success rate of such experiments will be slower than expected. He attributes this to reasons such as the higher cost of end-user devices, poor-quality mobile networks and data loss. "It's the beginning, and this will take time to mature and become a popular stream," he feels.


When we talk of IoT, what comes to mind first is a popular device like Google Glass. It's very futuristic stuff and, if commoditized, could work wonders in patient care. Niranjan Kumar Ramakrishnan, CIO of Sir Gangaram Hospital in New Delhi, is hoping that this innovation will come cheap some day so that the hospital can roll out a full-blown plan for its doctors. "We planned to pilot it with some doctors and were ready to see if that works. The technology was great and could have been very useful, but piloting itself was so costly that we had to drop the idea," says Niranjan.


"This could be a big innovation for Google but to use it in real life and at a mass scale, we need to justify the cost and at this stage it is prohibitive. We are happy with some frugal innovations to build up for the bigger one. Hopefully some day we will be able to roll out Google Glass for doctors and they can use it for the patient rounds without having to carry or ask for bulky files," he says.


Some unverified estimates suggest that nearly US$750 billion is wasted in healthcare costs every year, largely due to administrative overheads and the burden on hospitals. If IoT- or sensor-based hospital equipment is mainstreamed, it will certainly help cut this waste and lead to better proactive monitoring and better care.


There is no doubt that, just as IoT and M2M can revolutionize the manufacturing sector, they can do wonders in health enterprises too. According to a new report, the healthcare Internet of Things market segment is poised to hit $117 billion by 2020.


A recently published Verizon report says that organizations will introduce more than 13 million health and fitness tracking devices into the workplace by 2018. The report mentions that IoT-enabled wearable devices can also play an important role in preventive medicine: by encouraging people to lead a healthier lifestyle, the incidence of obesity and other factors behind many serious health conditions can be reduced. Wearables can also help general practitioners make sure that patients are sticking to activity plans.



But making that happen is a matter full of complexity, and integrating IoT with all sorts of disparate systems in the healthcare ecosystem is an even bigger challenge than we anticipate.


I am sure there are more CIOs than those mentioned here working on various IoT-based innovations in their enterprises, but the issues they face will be more or less similar.


Do you have a better story to share? I am keen to hear it.

Single Sign-on Services for Cloud: Are You In?


It is hard to imagine that just four, maybe five, years back, there was such an aura around cloud computing, especially in India. I remember the discussion in CIO forums that revolved around the basics: So, what exactly is cloud computing? What is the difference between public cloud and private cloud? And hey, what's this new cumulus in the sky called hybrid cloud?!

A good couple of years were spent demystifying the concept of the cloud and what it entailed and the different flavors it could be tasted in. But once the adoption curve began and the no-legacy-attached entrepreneurial ventures got off to their lightning starts on a cloudy backbone, there was no looking back.

Today, the conversation is around: What applications have you taken to the cloud already and which ones are lined up for the future? Or, which cloud provider gives you the best bang for your buck? How easy is it to manage a cloud SLA? And so on and so forth.

Before long, companies of all hues began using multiple apps on the cloud: HR applications, expense management, backup services, DR in the cloud, and what not.

The profusion of cloud apps and services has engendered a new breed of vendors that provide single sign-on to those cloud apps. Also known as cloud identity and access management services, the players include established vendors (CA, IBM, EMC, Microsoft and others) as well as new kids on the block (OneLogin, Centrify, Okta, Ping Identity and, as per an online source, hundreds more!)

Gartner tracks this segment in its famous Magic Quadrant reports. According to its recent MQ for IDaaS (Identity and Access Management as a Service) report, by 2019, 25% of IAM purchases will use the IDaaS delivery model--up from less than 10% in 2014. IAM or Identity and Access Management is the overall area that subsumes IDaaS as a sub-segment. A Research and Markets estimate suggests that the IAM market is expected to grow from $9.16 billion in 2014 to $18.30 billion in 2019, clocking a CAGR of nearly 15%.
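The quoted growth figures are easy to sanity-check with the standard compound annual growth rate formula:

```python
def cagr(start, end, years):
    """Compound annual growth rate: (end/start)^(1/years) - 1."""
    return (end / start) ** (1 / years) - 1

# $9.16B (2014) -> $18.30B (2019) over 5 years:
rate = cagr(9.16, 18.30, 5)
print(f"{rate:.1%}")  # roughly 15%, matching the estimate
```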

As with other high-action domains in tech, IDaaS has lots in store for both vendors rushing to sell the solutions and for CIOs willing to embrace them for reducing their ID/apps management burdens.

The Indian market, too, is opening up to this new segment, as more and more companies are now using multiple applications on the cloud, says Raghav Bhaskar, the Founder-CEO of home-grown IDaaS provider AppsPicket. The company focuses on simplifying access management for business cloud apps; all employees have to do is register a particular app once through AppsPicket's common interface. From then on, a special two-factor authentication process verifies the device being used whenever that app is accessed. This is quite unlike the OTP-based authentication that most banks and payment services currently deploy in the country. And yet, Bhaskar assures me, it is as secure as anything you can get from a sophisticated solution on the market.
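
The post does not detail AppsPicket's actual protocol, but device-bound second factors generally follow a challenge-response pattern: the device holds a secret registered once with the server and later proves possession of it, so no one-time code ever travels over SMS the way OTPs do. A minimal sketch of that pattern (the names and the shared-secret design are my illustrative assumptions, not AppsPicket's implementation):

```python
import hashlib, hmac, secrets

# --- Registration: done once per device, as described above ---
device_secret = secrets.token_bytes(32)          # stays on the device
server_store = {"alice-laptop": device_secret}   # server registers the device

def make_challenge() -> bytes:
    """Server issues a fresh random challenge on each app access."""
    return secrets.token_bytes(16)

def device_respond(secret: bytes, challenge: bytes) -> str:
    """Device proves possession of the secret without revealing it."""
    return hmac.new(secret, challenge, hashlib.sha256).hexdigest()

def server_verify(device_id: str, challenge: bytes, response: str) -> bool:
    expected = hmac.new(server_store[device_id], challenge,
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, response)

challenge = make_challenge()
print(server_verify("alice-laptop", challenge,
                    device_respond(device_secret, challenge)))  # True
```

Nothing is typed by the user and nothing is sent over SMS, which is why such schemes resist the interception attacks that plague OTPs.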

The software works with most frequently used cloud apps such as Amazon Web Services, Google Apps, Box, Office 365, Salesforce, Workday and others.

While all the apps being accessed through AppsPicket are currently secured with 128-bit encryption, Bhaskar says his firm is working on encrypting attachments and other files downloaded through those apps as well. The enhanced security features will then be incorporated into the newer versions.

As and when that happens, it will perhaps obviate the need for many enterprises to go in for separate Mobile Device Management (MDM) tools from the likes of AirWatch, Good Technology and MobileIron.

I think at some point in time, IAM and MDM will possibly converge, to the benefit of IT decision makers rather than vendors. What I mean is: if I were a CIO, why would I need two separate solutions--one for remotely and securely managing the device and another for (again) remotely and securely managing the user identity?

Indeed, the growing trend of BYOD and the ever-broadening acceptance of cloud-based enterprise access tools and mechanisms are creating complexities in ID and device management. Already, you have enterprise access management for on-premises, business-critical applications (largely done through solutions like Microsoft Active Directory). And now there's a slew of vendors fighting for the cloud-based app and ID management pie.

So, somewhere along the line, as more critical apps and functions are served on "the mobile cloud," there will be a great desire to see this messy access-management equation solved by an elegant, unified solution.

How do you think it will happen, if at all?


A well-known enterprise (name withheld) has banned employees across all its group companies from using mobile phones in office. On July 30, the ban was extended to the offices of all group companies, including.... The thinking behind the move, according to an employee (who did not want to be identified), is that "people tend to misuse social networking sites inside office. Popular chat applications, while used for official purposes, can also be used to chat with friends and family. So the management is trying to curb the unwanted use of these platforms during work hours".

One step forward, two steps backward, or did I get it wrong, two steps forward, one step backward! Either way it does not matter, reality for the hapless employees of Enterprise X is that one of their primary sources of collaboration and customer insights is no longer available to them. Everyone is bewildered; what was the trigger that caused it? Did employees protest? Did they voice their anguish to the Management? Did the quality of work improve after the draconian law was passed? Did output actually go up?

On the same page of the publication was another report: "No one really doubts the importance of digital transformation to today's businesses. Our global survey found that few can deny the mounting internal and external pressure to do more than pay lip service to digital. Yet, digital adoption is not achieved without changing the behavior of the business. Management must be convinced of the business case, a thorough implementation plan must be in place, and enterprises must alter and revamp internal processes, technology, and staff to make the digital transformation".

The founders had held on to control despite the next generation driving the business; the younger inheritors were adept at technology and current with the times, but unable to challenge the occasional unilateral and irrational directions the company took. Were the business interests of Enterprise X immune to digital disruption? The answer is a vehement no; in fact, they had led digital innovation from the front. Then something changed, and they started lagging behind the competition, with the gap widening over time.

Discussions on clamping down on social media and mobile enablement belong to the past. The digital aspirations of enterprises changed the dialogue, while benefits from mobile computing, driven by the consumerization of smartphones, created new opportunities for information dissemination and customer engagement. Better telecom networks also enable geographically dispersed workers to stay logically connected, improving collaboration and productivity. In such an environment, the concentration-camp mentality raises curiosity about the root cause.

Reading the report verbatim, the purported trigger was work hours being used for personal interests; this raises an interesting question about the dividing line between work and personal hours. For most, work hours extend beyond the time spent at the workplace; laptops changed the paradigm two decades back, and the smartphone redefined it a little later. Globalization of enterprises extended work hours beyond the conventional 9-to-5; whatever limited personal time remained disappeared in the economic troughs.

Is this enterprise sending the message that staff need to clearly demarcate work and life hours? Are they saying: don't indulge in any activity that can be construed as personal at the workplace, and similarly, when employees step out of the hallowed premises they shall not be expected to cast even a cursory glance at official email? Or, for that matter, if the workload increases, employees will not be expected to spend extra hours completing tasks at hand! An internal circular required people to deposit their phones on the way in and collect them when leaving the premises. OMG!

Everyone would nod to the fact that the attention span of the new generation has significantly reduced. The craving for messages and notifications has people checking their smartphones with increasing frequency; distractions apart, the reality is that everyone is also connected to the enterprise 24x7 with a rigor and passion unknown in the past. Weekends and vacations have laptops, tablets and smartphones as constant companions, forever seeking wireless networks. With work and life now disconnected, it's anybody's guess whether withdrawal symptoms will emerge post working hours.

Maybe this enterprise is path-breaking with this new policy; only time will tell...

This blog is reposted from Author's personal blogsite with prior permission.


It was a big day for the company; every function had been planning for many months. The last time around - which was actually the first time - the company had fallen flat amid a huge uproar from customers who were left dissatisfied and unfulfilled. Learning from the fiasco, they had planned carefully, looking at each and every component distributed among the three streams that had to work together for collective success. This was their last opportunity to redeem themselves; the team was on tenterhooks, and they badly wanted success.

D-day arrived; there was complete harmony between them, as if they had telepathic connectivity; intelligence of the highest order ensured that when one requested a resource, it was made available within a fraction of a second. Together they ensured that no one was starved of any resource, with what appeared to be limitless elasticity. The datacenter team had solved the problem of planned and unplanned demand peaks. They had learned to manage high variability and surge, and could now think of life beyond work.

Who comprised the magic team that found the silver bullet and solved the problems every datacenter head and CIO struggles with? The triad of the most critical resources for any application to work: network, compute and storage! With a combination of automation, orchestration, central governance and dynamic workload placement, the ultimate dream of a Software Defined Datacenter - one that is application aware and ensures that no resource is ever constrained, irrespective of workload - had been achieved.
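
The "dynamic workload placement" part of that combination can be pictured as a scheduler that drops each workload onto whichever resource pool has the most headroom, and elastically adds capacity when none does. A toy sketch of the idea (illustrative only; no vendor's actual placement algorithm is this simple):

```python
from dataclasses import dataclass

@dataclass
class Pool:
    name: str
    capacity: int   # abstract units spanning compute/storage/network
    used: int = 0

    def headroom(self) -> int:
        return self.capacity - self.used

def place(demand: int, pools: list) -> Pool:
    """Place on the pool with the most headroom; scale out if none fits."""
    best = max(pools, key=Pool.headroom)
    if best.headroom() < demand:
        best = Pool(f"elastic-{len(pools)}", capacity=demand)  # spin up
        pools.append(best)
    best.used += demand
    return best

pools = [Pool("rack-1", 100), Pool("rack-2", 100)]
print(place(60, pools).name)  # lands on the emptiest rack
print(place(80, pools).name)
print(place(90, pools).name)  # nothing fits -> elastic capacity is created
```

The real SDDC value lies in doing this continuously, application-aware and within SLA boundaries, rather than as a one-shot allocation.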

The journey started almost two decades back with server virtualization, quickly evolving to let workloads spread from one server across multiple boxes within the rack, then the datacenter, and finally the cloud. With unlimited resources running on commodity hardware, it is now possible to react to sudden as well as planned upsurges. Cloud providers have built capacity that can collectively address the entire world's steady-state needs. Economies of scale have been achieved by many; the decision to build dedicated datacenters is now an emotional, not a rational, one.

Software Defined Networks have evolved to allocate resources based on workloads, managing capacity to optimize. They are application aware and prioritize traffic based on neural-network-based learning; the disruption to existing hardware players had them acquiring some of the innovative new startups. Finally, hardware-independent storage virtualization and automation brings up the rear, having evolved last and still facing interoperability challenges that threaten the stranglehold of dominant players. Collectively, the SDDC has arrived and is gaining momentum!

The Software Defined Datacenter is a complex architecture of hardware, network, compute, storage and unified software that is application-context aware, works within the boundaries of defined SLAs, and is configured for the business. The abstraction layer thus defined delivers everything as a measurable, software-configurable service. The business logic layer is intelligent enough to identify trends and constraints, overcoming them in an interconnected world of on-premise datacenter and cloud (compute, storage and applications).

In contrast, traditional datacenters focus on discrete components - hardware and devices - which require individual configuration, provisioning and monitoring. Depending on where they are on the maturity curve, enterprises have adopted various parts of the SDDC in their environments, whereas mature service providers have come closer to creating a true software defined datacenter. Interoperability and transition between competing SDDC propositions remain a challenge, with proprietary and still-evolving software stacks.

So where should IT and CIOs focus, or place their bets? Can they create the magic described above within their datacenters? Is it feasible with permissible or marginally higher investments? Which vendors should they bet on, and how should they manage the existing stack of hardware and software? Does it make sense to just outsource the entire proposition and let someone else manage the solution and the associated headaches? What about the availability of skills to create and then manage the complexity that SDDCs bring?

I believe the answer lies with existing scale and intricacy; the SDDC can bring a high level of efficiency to large datacenter operations, where variability of demand coupled with dynamic scale provisioning will bring efficiency and potential cost savings over the long run. The other scenarios that make a case are businesses with unpredictable load and variability (e.g. ecommerce portals, news sites, stock exchanges and social media). Smaller companies can derive benefit from parts of the solution through high levels of virtualization.

Either way, start building components that can help you move towards SDDC.

Ah yes, a hint of spring is in the air. The days are getting longer and warmer. Soon I will be able to indulge in one of my favorite pastimes, fishing. Fishing, as a sport, is always a challenge. Fish respond to a multitude of factors like water temperature, type of forage, sunlight and ever-changing stream conditions. One must weigh a variety of factors in order to be successful in actually catching a fish. Simply put, one must think like a fish to catch a fish. You need to assess the physical environment, the water temperature, amount of sunlight, stream conditions, the aquatic insects that serve as food and the depth of the water.

So why is protecting your data any different? Data protection, like fishing, is a challenge. The amount of data is never static or unchanging. Those whose job it is to architect a coherent data protection strategy - primarily CIOs - must take into consideration a wide range of factors to cost-effectively manage and protect their data as well as ensure a consistent, repeatable recovery strategy. Are they managing physical, virtual or cloud infrastructure? Is the cloud infrastructure public or private? What are the recovery time objectives (RTO) and recovery point objectives (RPO)? On top of these considerations is the application environment they must support.
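
Those two objectives reduce to simple arithmetic: RPO bounds how much data you can afford to lose (at worst, one backup interval), and RTO bounds how long you can afford to be down (at least the restore time). A first-pass check of a candidate scheme might look like this (a sketch only; real planning must also account for backup duration, transfer windows and verification time):

```python
def meets_objectives(backup_interval_h: float, restore_time_h: float,
                     rpo_h: float, rto_h: float) -> dict:
    """Worst-case data loss is one full backup interval; worst-case
    downtime is at least the restore time."""
    return {
        "rpo_ok": backup_interval_h <= rpo_h,
        "rto_ok": restore_time_h <= rto_h,
    }

# Nightly backups (24h apart), 6h restore, against a 4h RPO / 8h RTO target:
print(meets_objectives(24, 6, rpo_h=4, rto_h=8))
# {'rpo_ok': False, 'rto_ok': True} - nightly backups miss a 4-hour RPO
```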

In today's dynamic IT environment, CIOs must weigh and assess a variety of considerations, particularly when the volume, velocity and veracity of data growth are unabated. The proliferation of virtual servers and applications makes the task of architecting a coherent protection and recovery strategy extremely challenging. Protecting data is a must; ensuring timely and consistent recovery is no longer a luxury. Firms that do not have clearly defined protection and recovery strategies in place run the risk of being at a competitive disadvantage. You see, data is the new currency. Any data loss is unacceptable and potentially detrimental to a firm's bottom line, credibility or, if it is a public company, market value. Exposure to data loss is a risky proposition for an IT administrator, but particularly for a CIO, whose job it is to safeguard the firm's critical assets.

In this new era of highly virtualized infrastructure, CIOs must have architectures in place that are flexible and can span virtual and physical as well as cloud environments. Data protection and recovery become a significant challenge, especially when critical applications such as Exchange or SQL Server need to be adequately protected. The application environment is ever-changing and now highly virtualized. Moreover, application owners want the shortest possible RTOs and RPOs. They also want self-service recovery, to ensure they can resurrect their data rapidly in the event of an unforeseen outage. As a result, greater emphasis is placed on rapid, repeatable recovery of applications and data. More importantly, a flexible and consistent disaster recovery plan needs to be in place. This places an enormous burden on CIOs to bridge legacy infrastructure, applications and processes so that recovery is consistent and repeatable, and scales to cover a greater number of applications.

In essence, a CIO needs new tools and processes to ensure recovery in this new era of rapidly virtualizing, cloud-connected environments with expansive, unabated data growth. Today, the process of protection, recovery and disaster recovery makes catching a fish look easy indeed. CIOs must consider solutions that better align with their business objectives and allow them to reduce costs, save administrative time and reduce their exposure to data loss and unnecessary risk. The same old tools and processes will simply not protect many organizations' critical data assets or ensure rapid recovery in the event of a disaster. CIOs are now in the unenviable position of safeguarding their organization's data. If they fail, they may be asked to go on an extended fishing vacation, or to jump off a short pier.

This is a sponsored blog by Dell.

Image courtesy: Dell


Big Data + Big Analytics = Big Insights


In "Big Data, Big Analytics and You", I wrote:

"Big data is obviously important to most organizations. There's plenty of data out there and even more data being generated every day. But big analytics is just as important. Without the ability to analyze the data you have at the speed and scale it is being generated, your organization will never really be able to fully take advantage of big data."

I wanted to follow up that post with another that talks a bit more about the importance of analytics and converting data into actionable insight.

If you ask business executives whether they'd prefer more data or more insight into their business, most would (and should) say that they want and need more insight into their businesses. Some people might argue that in order to get more insight, you need more data, but in my experience this is far from true.

More isn't always better. More data doesn't deliver more insight. Businesses do not need more data, they need to be able to use data better. Once an organization figures out how to analyze data more effectively to gain the insights they need, only then will more data make a difference.

Data itself is interesting but useless until you do something to find and understand the 'signals' contained within. Until you convert your data into information you have nothing of value. Until this conversion happens, you've done nothing but waste money on collecting and storing a whole lot of nothing.

In order to turn this 'nothing' into something, companies must find ways to locate the signal within enormous amounts of data. This signal then leads to information, knowledge and, ultimately, wisdom. This is where 'big analytics' comes into play, because in order to truly find value in big data, you must analyze that data at scale. Sure, you can use Excel or some other simple approach to try to dig through your data, but Excel won't cut it for the amounts of data most organizations need to analyze.
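
"Analyzing at scale" frequently comes down to processing data as a stream rather than loading it all into a spreadsheet. A single-pass summary such as Welford's online mean-and-variance algorithm, for instance, digests millions of records in constant memory, which is exactly where a spreadsheet gives up; a sketch:

```python
def streaming_stats(values):
    """Welford's online algorithm: one pass, O(1) memory."""
    n, mean, m2 = 0, 0.0, 0.0
    for x in values:
        n += 1
        delta = x - mean
        mean += delta / n
        m2 += delta * (x - mean)
    return n, mean, (m2 / n if n else 0.0)

# Works on a generator, so the full dataset never sits in memory:
n, mean, variance = streaming_stats(x % 100 for x in range(1_000_000))
print(n, round(mean, 3))  # 1000000 49.5
```

The same one-pass pattern generalizes to counts, top-k and sketch-based summaries, which is what "big analytics" platforms do under the hood at far larger scale.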

Companies need to analyze at scale to find the insights that their executives need and want. This requires the right analytics tools and systems, the right people with the right skills and a culture that allows people to dig into whatever data they feel is necessary to find answers (and new questions).

Neither big data nor big analytics is the answer to today's business problems but they are the start to finding many answers that a business needs to find as well as finding some questions that organizations didn't know they had. It won't be easy and it can be expensive, but if you are truly looking for insights into your business, there's no better way to find those insights than by combining a good big data strategy with a good big analytics strategy.

Big data and big analytics can provide big insights for any organization willing to put the time and effort into building a big analytics practice.

This blog first appeared on Eric's Blog. It is reposted here with prior permission. For more blogs, please visit his blogsite.



All this while, the industry has been talking about smarter ways to do Exchange backups.

Why? Well, to be honest, they can be a pain.

If you're the "computer guy" (or girl) for your company, you probably spend lots of time messing with Exchange backups, and backups in general.

In a couple of previous articles we talked about the quite primitive backup method known as offline backups, and the slightly more advanced Native Windows Server backup. Both have advantages and drawbacks; they may be right in some situations, but you need to be aware of the limitations.

A Step Up For You?

Today, let's talk about file- and item-based backups. This technology is favored by most third-party backup software products. And it's a step up from the methods we've already talked about, in some ways.

Want advantages? Here's a big one: It backs up not only the Exchange database but also individual items in the database such as email messages and contacts.

The benefit here over the previous two backup types we discussed is that you can restore individual email messages. This single-item recovery ability improves restore times in a big way (and can make your life a lot easier when the boss accidentally deletes an important email).

This is especially true when you have the backup files stored on fast disks instead of slow magnetic tapes.

Want another benefit of this backup type? You can use it for both physical and virtual machines.

But not so fast...

You figured there had to be some drawbacks, right? Right.

Drawback #1: your data is essentially being backed up twice. This takes more time.

Drawback #2: you pretty much have to do the individual items backup during your off-hours backup window. It's not practical for you to have the backup program querying the server many times during the workday looking for new emails and individual items to back up.

So you still need to plan on having the file- and item-based backup run during a time when the Exchange databases can be taken offline.

That may work just fine if you have a small "mom and pop" operation, but for a big company with teams working around the clock? Not so much.

Drawback #3: with this method, if you don't run backups until after the workday is over, you end up with a full day's worth of email at risk.

Drawback #4: like the offline and native Windows Server backups, file- and item-based backups don't give you an automated verification of the backup's accuracy. You'll have to run a manual recovery of a selected file to test the backup data's validity.
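
That manual verification step usually boils down to restoring a sample item to a scratch location and proving it is bit-for-bit identical to what was backed up. A checksum comparison is the standard way to script it (a generic sketch, not tied to any particular backup product):

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Hash a file in chunks so even large mailbox files fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_restore(original: Path, restored: Path) -> bool:
    """A test restore only passes if the copy is bit-for-bit identical."""
    return sha256_of(original) == sha256_of(restored)
```

Run that against a handful of randomly chosen items after each backup cycle and you have a poor man's automated verification.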

Although in some ways, this backup technology is a big step up from the previous two we discussed - the ability to restore an individual email is nothing to sneeze at, and can be a lifesaver when you need it - in other ways, it's still sadly lacking.

We Want To Help You!

As always, before you choose an Exchange backup and recovery solution, consider your particular situation from all the angles:

  • How big is your company?
  • How important is email to you?
  • How important is it to be able to restore individual emails?
  • Do you have an available consistent "backup window" when you can take Exchange offline?
  • What's your budget for a backup solution?

Need help making your decision? We're here for you. We've put together a valuable guide you can have for free. It's an eBook called Six Ways To A Smarter Microsoft Exchange Backup.

This is a sponsored blog by Dell.

Image courtesy: Dell



In a scenario where, by the year 2020, over 100 billion devices will be connected to the web and businesses will have to manage 50 times the data they manage at present, Digital Technologies (Social, Mobile, Analytics & Cloud) will have a multiplying impact on businesses and increase productivity across the organization.

Core Quadrant, research arm of, undertook two massive research studies during the January-May 2015 timeframe to ascertain the importance and vitality of Digital Technology for both enterprise users and technology producers. The research found that 50% of IT growth in the year 2020 will come from Digital Technologies alone.

The Core Quadrant research findings also establish that while nearly 83% of Indian enterprises today are in the initial stages of experimenting with digital technologies - in silos - they are expected to move towards full digital play, with the help of matured underlying IT platforms, in the next 3 years.

Kapil Dev Singh, Principal Analyst, Core Quadrant, who spearheaded the research says that the digital enterprise is an idea, which is fast taking wings now. It is made relevant by the need for agility amid the contemporary business challenges and enabled by the advent of a plethora of digital technologies. "The journey of enterprises to embrace digital demands an all rounded focus on the competitive landscape, strategic choices, leadership competencies, process capabilities, enterprise culture and the technology platform," he says. 

The research further finds that the Digital technology spend by the Indian enterprises and government shall touch US$10 Billion by 2020. It means a cumulative spend of US$37 billion between 2015 and 2020.

The findings of the research indicate that nearly 50% of IT growth in 2020 shall be contributed by digital, up from 34% in 2015.

One out of every four dollars spent on IT, both in enterprise and government, in 2020 will be on digital technologies (up from 1 out of 7 in 2015).

Another interesting finding from the research was that mobility rules among the SMAC pillars: nearly 65% of enterprises undertook a mobility initiative in the last 2 years (2013-2015).

Highlighting the downside of this preparedness, the research found that only 1 out of 9 enterprises' IT platforms today is adequately prepared for driving big digital initiatives. Although 60% of enterprises have overhauled their IT infrastructure and core applications in the last 2 years (2013-2015) to prepare for digital initiatives, they aren't fully prepared yet.

The full research will be available to CIOs and senior IT decision makers in India online.


Cloud computing is the future of computing. Specialization and outsourcing make society more efficient and scalable, and computing isn't any different.

But why aren't we there yet? Why don't we, in Simon Crosby's words, "get on with it"? I have discussed some reasons: loss of control, new and unquantifiable security risks, and -- above all -- a lack of trust. It is not enough to simply discount them, as the number of companies not embracing the cloud shows. It is more useful to consider what we need to do to bridge the trust gap.

A variety of mechanisms can create trust. When I outsourced my food preparation to a restaurant last night, it never occurred to me to worry about food safety. That blind trust is largely created by government regulation. It ensures that our food is safe to eat, just as it ensures our paint will not kill us and our planes are safe to fly. It is all well and good for Mr. Crosby to write that cloud companies "will invest heavily to ensure that they can satisfy complex...regulations," but this presupposes that we have comprehensive regulations. Right now, it is largely a free-for-all out there, and it can be impossible to see how security in the cloud works. When robust consumer-safety regulations underpin outsourcing, people can trust the systems.

This is true for any kind of outsourcing. Attorneys, tax preparers and doctors are licensed and highly regulated, by both governments and professional organizations. We trust our doctors to cut open our bodies because we know they are not just making it up. We need a similar professionalism in cloud computing.

Reputation is another big part of trust. We rely on both word-of-mouth and professional reviews to decide on a particular car or restaurant. But none of that works without considerable transparency. Security is an example. Mr. Crosby writes: "Cloud providers design security into their systems and dedicate enormous resources to protect their customers." Maybe some do; many certainly do not. Without more transparency, as a cloud customer you cannot tell the difference. Try asking either Amazon Web Services or to see the details of their security arrangements, or even to indemnify you for data breaches on their networks. It is even worse for free consumer cloud services like Gmail and iCloud.

We need to trust cloud computing's performance, reliability and security. We need open standards, rules about being able to remove our data from cloud services, and the assurance that we can switch cloud services if we want to.

We also need to trust who has access to our data, and under what circumstances. One commenter wrote: "After Snowden, the idea of doing your computing in the cloud is preposterous." He isn't making a technical argument: a typical corporate data center isn't any better defended than a cloud-computing one. He is making a legal argument. Under American law -- and similar laws in other countries -- the government can force your cloud provider to give up your data without your knowledge and consent. If your data is in your own data center, you at least get to see a copy of the court order.

Corporate surveillance matters, too. Many cloud companies mine and sell your data or use it to manipulate you into buying things. Blocking broad surveillance by both governments and corporations is critical to trusting the cloud, as is eliminating secret laws and orders regarding data access.

In the future, we will do all our computing in the cloud: both commodity computing and computing that requires personalized expertise. But this future will only come to pass when we manage to create trust in the cloud.

This essay previously appeared on the Economist website, as part of a debate on cloud computing.

To read the Part One, Click Here, and to read the Part Two, Click Here.

This three-part blog was first posted on Author's Blog. It is re-posted here with prior permission and is fully attributed to the Author.



Let me start by describing two approaches to the cloud.

Most of the students I meet at Harvard University live their lives in the cloud. Their e-mail, documents, contacts, calendars, photos and everything else are stored on servers belonging to large internet companies in America and elsewhere. They use cloud services for everything. They converse and share on Facebook and Instagram and Twitter. They seamlessly switch among their laptops, tablets and phones. It wouldn't be a stretch to say that they don't really care where their computers end and the internet begins, and they are used to having immediate access to all of their data on the closest screen available.

In contrast, I personally use the cloud as little as possible. My e-mail is on my own computer -- I am one of the last Eudora users -- and not at a web service like Gmail or Hotmail. I don't store my contacts or calendar in the cloud. I don't use cloud backup. I don't have personal accounts on social networking sites like Facebook or Twitter. (This makes me a freak, but highly productive.) And I don't use many software and hardware products that I would otherwise really like, because they force you to keep your data in the cloud: Trello, Evernote, Fitbit.

Why don't I embrace the cloud the way my younger colleagues do? There are three reasons, and they parallel the trade-offs faced by corporations making the same decisions.

The first is control. I want to be in control of my data, and I don't want to give it up. I have the ability to keep control by running my own services my way. Most of those students lack the technical expertise, and have no choice. They also want services that are only available on the cloud, and have no choice. I have deliberately made my life harder, simply to keep that control. Similarly, companies are going to decide whether or not they want to -- or even can -- keep control of their data.

The second is security. I talked about this at length in my opening statement. Suffice it to say that I am extremely paranoid about cloud security, and think I can do better. Lots of those students don't care very much. Again, companies are going to have to make the same decision about who is going to do a better job, and depending on their own internal resources, they might make a different decision.

The third is the big one: trust. I simply don't trust large corporations with my data. I know that, at least in America, they can sell my data at will and disclose it to whomever they want. It can be made public inadvertently by their lax security. My government can get access to it without a warrant. Again, lots of those students don't care. And again, companies are going to have to make the same decisions.

Like any outsourcing relationship, cloud services are based on trust. If anything, that is what you should take away from this exchange. Try to do business only with trustworthy providers, and put contracts in place to ensure their trustworthiness. Push for government regulations that establish a baseline of trustworthiness for cases where you don't have that negotiation power. Fight laws that give governments secret access to your data in the cloud. Cloud computing is the future of computing; we need to ensure that it is secure and reliable.

Despite my personal choices, my belief is that, in most cases, the benefits of cloud computing outweigh the risks. My company, Resilient Systems, uses cloud services both to run the business and to host our own products that we sell to other companies. For us it makes the most sense. But we spend a lot of effort ensuring that we use only trustworthy cloud providers, and that we are a trustworthy cloud provider to our own customers.

This essay previously appeared on the Economist website, as part of a debate on cloud computing. 

To read Part One, click here.

This three-part blog was first posted on the author's blog. It is re-posted here with prior permission and is fully attributed to the author.


Yes. No. Yes. Maybe. Yes. Okay, it's complicated.

The economics of cloud computing are compelling. For companies, the lower operating costs, the lack of capital expenditure, the ability to quickly scale and the ability to outsource maintenance are just some of the benefits. Computing is infrastructure, like cleaning, payroll, tax preparation and legal services. All of these are outsourced. And computing is becoming a utility, like power and water. Everyone does their power generation and water distribution "in the cloud." Why should IT be any different?

Two reasons. The first is that IT is complicated: it is more like payroll services than like power generation. What this means is that you have to choose your cloud providers wisely, and make sure you have good contracts in place with them. You want to own your data, and be able to download that data at any time. You want assurances that your data will not disappear if the cloud provider goes out of business or discontinues your service. You want reliability and availability assurances, tech support assurances, whatever you need.

The downside is that you will have limited customization options. Cloud computing is cheaper because of economics of scale, and­ -- like any outsourced task -- ­you tend to get what you get. A restaurant with a limited menu is cheaper than a personal chef who can cook anything you want. Fewer options at a much cheaper price: it's a feature, not a bug.

The second reason that cloud computing is different is security. This is not an idle concern. IT security is difficult under the best of circumstances, and security risks are one of the major reasons it has taken so long for companies to embrace the cloud. And here it really gets complicated.

On the pro-cloud side, cloud providers have the potential to be far more secure than the corporations whose data they are holding. It is the same economies of scale. For most companies, the cloud provider is likely to have better security than them­ -- by a lot. All but the largest companies benefit from the concentration of security expertise at the cloud provider.

On the anti-cloud side, the cloud provider might not meet your legal needs. You might have regulatory requirements that the cloud provider cannot meet. Your data might be stored in a country with laws you do not like­ -- or cannot legally use. Many foreign companies are thinking twice about putting their data inside America, because of laws allowing the government to get at that data in secret. Other countries around the world have even more draconian government-access rules.

Also on the anti-cloud side, a large cloud provider is a juicier target. Whether or not this matters depends on your threat profile. Criminals already steal far more credit card numbers than they can monetize; they are more likely to go after the smaller, less-defended networks. But a national intelligence agency will prefer the one-stop shop a cloud provider affords. That is why the NSA broke into Google's data centers.

Finally, the loss of control is a security risk. Moving your data into the cloud means that someone else is controlling that data. This is fine if they do a good job, but terrible if they do not. And for free cloud services, that loss of control can be critical. The cloud provider can delete your data on a whim, if it believes you have violated some term of service that you never even knew existed. And you have no recourse.

As a business, you need to weigh the benefits against the risks. And that will depend on things like the type of cloud service you're considering, the type of data that's involved, how critical the service is, how easily you could do it in house, the size of your company and the regulatory environment, and so on.

This essay previously appeared on the Economist website, as part of a debate on cloud computing. It's the first of three essays. The next two parts will be published subsequently.

This post first appeared on Bruce Schneier's Blog and is reposted here with his prior permission and directly attributed to the author.



To tweak a quote from the classic Sergio Leone film The Good, the Bad and the Ugly, "there are two types of people in this world: people who save time and people who fail." 

In data protection, and indeed in all of IT, time is critical and cannot be wasted, so saving time on data backups matters. As a theoretical matter, time is the platform on which everything in your computing environment runs -- with each passing second, there is more data; there is more activity happening in your organization that needs to be routed, categorized, saved, backed up, accounted for and attended to.

But as a personal matter, time is equally important. Let's face it, we would all rather be doing other things -- personal or work related -- than tending to an overly complex or unnecessarily labor-intensive backup environment.

So here are six ways to save time on a daily basis in your backup and recovery environment.

The Good

Good time-saving strategies involve looking toward the future. Ideally, you want to save time now and position yourself to save time down the road by:

  1. Embracing new technology. Specifically, investigate products that allow you to automate previously time (or resource) consuming processes. Current examples can include compression and deduplication, but vendors are always innovating. Stay up to date on the latest technologies that may be able to help you now and in the future.
  2. Planning for the future. In your day-to-day life, you need to have a firm grasp on your current environment, and you need to install processes that achieve your goals now. But do not overlook the fact that data growth is a near constant in today's IT world. You will need to grow, and you will need to scale. It's a fact, and it can't be overlooked. Planning for the future will keep you from scrambling when you reach certain milestones.
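Deduplication, one of the time-savers mentioned above, is simple in principle: store each unique block of data once and keep only references for the repeats. Here is a minimal, illustrative sketch of the idea (content-hash based, not how any particular vendor's product works):

```python
import hashlib

def deduplicate(chunks):
    """Store each unique chunk once, keyed by its content hash."""
    store = {}     # content hash -> chunk data
    manifest = []  # ordered list of hashes to reconstruct the stream
    for chunk in chunks:
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # only the first copy is kept
        manifest.append(digest)
    return store, manifest

def restore(store, manifest):
    """Rebuild the original stream from the manifest."""
    return b"".join(store[d] for d in manifest)

# A backup stream with many repeated blocks...
data = [b"block-A", b"block-B", b"block-A", b"block-A", b"block-C"]
store, manifest = deduplicate(data)
# Five logical chunks arrive, but only three unique ones are stored.
```

Real products chunk data far more cleverly (variable-size chunking, compression of the unique chunks), but the payoff is the same: repeated data costs almost nothing to back up.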

The Bad

Bad time-saving strategies involve short-sightedness. Like a bad fast-food meal, you may feel full after eating it, but you will soon be hungry again, and that meal provided nothing but empty calories. You should avoid:

  1. Passing the buck. Especially if you are part of an organization that has instituted multiple IT teams for different functions, try to avoid passing problems or potential problems to other teams. While taking more responsibility may seem like it takes more time, in the end, you can ensure whatever needs to be done is being done right the first time, preventing the need to go back and redo it the right way.
  2. Sandbagging. A popular term in the sales industry, sandbagging is when you hold off on executing a particular tactic immediately in order to close at a time that is viewed as more beneficial by executives. For example, if you know you are going to need to expand your storage capabilities to comply with a new compliance regulation that will go into effect in six months, don't wait. Sandbagging until the last minute always results in unnecessary stress and some kind of inevitable and unexpected problem that ends up costing you more time.

The Ugly

Bad and ugly strategies are pretty similar, though for the sake of discussion, let's define ugly strategies as those that make you look bad, be it in the eyes of your executives or your customers. This includes:

  1. Chasing features. The flip side of the good strategies, being overly aggressive with introducing new, specific features in your environment can lead to a very complicated system. A dozen different vendors, applications and platforms jammed together usually means increased complexity. Unnecessary complexity means unnecessarily wasted time.
  2. Doing nothing. Of course, the ultimate strategy that will make you look bad is to bury your head in the sand and avoid solving problems and challenges as they arise. It will certainly save you time in the short run, but it may be so effective that it gives you all the time back in your day, which of course means you will soon be spending that time filling out unemployment paperwork.

You want to be good, right? You certainly don't want to be ugly! For additional tips and details on ways to save time and money on backups, check out this ebook.

This blog was first posted on Dell Data Protection Blog and is re-posted here with prior permission.


As a result of new and advanced technology, the way enterprises operate has been significantly transformed. Business units, particularly software and web development teams, but also marketing, HR and finance, want to take advantage of the new generation of cloud services, meaning that an interest in cloud technology is no longer solely the domain of the IT team. The consequence is that every business unit leader has their own IT solution in mind, rather than waiting for the CIO to deliver against a set of requirements. This leads to the CIO and the IT department losing the control they are expected to have, and then having to deal with the pitfalls that come from having systems and data outside the company's infrastructure. Finding a solution that meets the needs of all stakeholders is one of the greatest challenges facing enterprises today, and one that will demand that the role of the CIO and IT department changes forever.


The convergence of cloud computing and business-critical applications is having a profound effect and is perhaps the most transformative technology trend affecting today's workplace. Employees want instant access to their tools from any location, at any time, and so they increasingly want to use cloud-based solutions. These cloud systems offer unlimited scalability of storage and data-processing capabilities, helping business-critical applications to run more efficiently than ever before. Businesses are consequently open to a wide range of growth opportunities, being more able to respond to dynamic market conditions and competition, serve new geographies, and to rapidly develop new products and services.


Despite the benefits that cloud computing offers the business, the IT department are understandably nervous about allowing business units access to the wide range of self-service solutions they need. Due to budget limitations and corporate policies, the IT department end up acting as a gatekeeper and restricting the services to which employees can have access.


As a result, employees are procuring their own solutions in order to meet their specific business requirements without waiting for the IT department. According to Centrify, more than two-thirds of organizations admit that unauthorized cloud applications are being implemented without IT's knowledge or involvement. This "shadow IT" could pose a significant threat to an organization's information security and availability, potentially impacting its revenue. According to Gartner, 35% of organizations' technology budget will be spent outside of the IT department by 2020. Shadow IT will continue to grow unless business units and departments are provided with the flexible tools they seek as part of the services IT can offer.


Each business unit faces differing technological requirements. A software team building an application to sell in the market usually requires a setup of several servers, storage and networking, i.e. 'QA environments'. Software development teams could greatly benefit from the agility of creating instances in a short space of time, the efficiency of not having to manage and maintain the infrastructure, and paying only for the infrastructure they need.


Web content teams also greatly benefit from the cloud. It opens up resources from multiple internet-connected devices with lower barriers for entry. The cloud gives them access to hosted applications and data along with cloud-based development services, allowing content teams to create web applications that have remote access to data and services like never before. Cloud-based developments allow developers to build and host web apps on the company's own cloud server, speeding up the development and deployment processes.


The major challenge facing the IT department is to regain control and enable employees to work flexibly with applications that meet their requirements, whilst also making sure the company's data is secure and that the CIO can deliver the compliance and reporting requirements demanded. To do so, the CIO and the IT department need to bridge the gap between IT's need for control and the business units' need for flexibility by offering flexible and dynamic self-service solutions. In this way, IT becomes an enabler of business change rather than a barrier to it, and by choosing the best hybrid cloud platform to offer both on-premise and cloud solutions, their organisations will avoid the pitfalls created by using a variety of different systems and cloud offerings within a single organisation. CIOs need to understand that each business unit's demand to become stronger, more innovative and more competitive requires this service-based approach. This in turn will position the CIO as a business enabler, supporting greater efficiency and innovation to fuel growth and increased revenue, while also future-proofing the organisation's technology infrastructure to maintain a competitive edge.


CIOs must immediately turn their attention to future-proofing their business and creating a unified vision by choosing an appropriate hybrid cloud infrastructure that meets their business' requirements, before the business beats them to it. They need to embrace shadow IT within the enterprise rather than trying to shut it down. As CIOs continue to be pulled in different directions by employees, cloud providers and customers, it's clear that a single cloud strategy is no longer sufficient to meet a variety of needs. Hybrid cloud - combining public cloud infrastructure and external cloud services with private cloud deployments and on-premise IT systems - can provide flexibility, structure and security, and enables the CIO to regain control over the IT environment, assuring data governance while also enabling the various areas of the business to evolve with the right technology to support them. Now is the time for the CIO to shake off the gatekeeper reputation and step confidently into the role of ultimate business enabler, with hybrid cloud facilitating this evolution.



This blog was first published elsewhere and is reposted here with prior permission. It is directly attributed to its author.



In information technology and software development, how many of our wounds are self-inflicted?

Here's what I mean.

What I've seen happen, recurringly, in IT over the years and decades: idealistic but inexperienced people come along (within IT itself or in other departments within a company), to whom IT systems and issues seem to be easier than they in fact are. They are smart and earnest and oh-so-self-assured, but they also seem blissfully unburdened by much real understanding of past approaches.

They are dismissive of the need for much (if any) rigor. They generalize often quite broadly from very limited experience. Most notably, they fail to understand that what may be effective for an individual or even a small group of developers often doesn't translate into working well for a team of any size. And, alas, there's usually a whole host of consultants, book authors, and conference presenters who are willing and eager to feed their idealistic simplifications.

Over the years, I've seen a subset of developers in particular argue vehemently against any number of prudent and long-accepted IT practices: variously, things like source code control, scripted builds, reuse of code, and many others. (Oh, yes, and the use of estimates. Or, planning in general.) To be sure: it's not just developers; we see seasoned industry analysts, for reputable firms, actually proposing that to get better quality, you need to "fire your QA team." And actually getting applauded by many for "out of the box thinking." Self-inflicted wounds.

But let's talk about just one of these "throw out the long-used practice" memes that pops up regularly: dismissing the value of bug tracking.

How can anyone argue against tracking bugs? Unbelievably, they do, and vigorously.

A number of years back, I came into a struggling social networking company as their first CTO. Among other issues, I discovered that the dev team had basically stopped tracking bugs a year or two before, and were proud of that. Did that mean their software had no bugs? Not when I talked to the business stakeholders, who lamented that nearly every system was already bug-riddled, and getting worse by the release.

Why did the development team shun bug tracking? That wasn't quite so clear.

They told me dismissively that Bugzilla (their former, now-mothballed bug tracking tool of choice) had just "filled up with a lot of text". They cited all sorts of idealistic Agile justifications: if you really want to address a bug, you should do it right away, they opined, otherwise it's not really important anyway. Index cards make the bug more visible than burying it in a tracking system. And so on. They showed me with great pride how they'd set up a light to turn green when all the automated tests passed. And hey, the light was green. In short, they were in deep denial about the proliferation of bugs. And no tracking meant no real data existed to contradict that claim, just people's anecdotal impressions and vocal complaints.

But then, looking around the team's work area, I found hundreds of index cards for various bugs, some of them many months old, randomly scattered in piles behind various workstations, scribbled with cryptic notes. No one knew for sure which bugs had been fixed, worked on, dismissed, etc. Least of all the stakeholders. And the end customers complained endlessly about bugs to the company's customer care representatives, who then felt that no one internal really listened to them as the "voice of the customer."

Recurring, self-inflicted wounds

So it's déjà vu for me when I read extreme, wrong-headed tweets like this:

Various tweets

This isn't just a one-individual disconnect or blind spot. It's distressingly common among software developers and (often) their management: the mistaken but dogged belief that we'll find all bugs, and we'll just fix them as we find them. But that simply doesn't happen on non-trivial systems or projects. As Joel Spolsky noted,  "programmers consistently seem to believe ... that they can remember all their bugs or keep them on post-it notes." But developers can't, and they don't. And anyone who has worked in a development shop of any reasonable size knows that developers can't, and that even if they could, there are numerous compelling reasons to more formally track bugs anyway.

Half an hour of surfing (see the Lagniappe below) and a little brainstorming will get you a list of powerful reasons to do bug tracking more formally than using scattered index cards:

What you lose

Here are just a few of the things a team loses when there's no use of bug tracking software:

  1. Information is lacking that is needed for troubleshooting the issue and confirming a fix. Index cards rarely contain the full scenario needed to understand and resolve the bug: specifically, the detailed steps to reproduce it, the expected result, and the actual result. Instead, you tend to get scribbled notes like "the footer sometimes gets wonky in the checkout module."
  2. Other than leaning on "tribal knowledge," (essentially, talking to individual heroes and gurus on the team), developers have to start from scratch every time an issue comes in. They can't easily leverage what's already been solved.
  3. With just "tribal knowledge" and no searchable database, it's hard to tell if a newly reported issue is a known problem in an unusual guise, or a completely new problem.
  4. It's difficult, lacking a historical database of bugs, to identify a pattern in the bugs that might suggest a more holistic fix than just addressing each individual bug.
  5. Lacking a searchable database of bugs and the ensuing record of discovery/research on them, it's much more difficult to transfer responsibility for a given bug from one developer to another.
  6. Users have no clear process for ensuring or confirming that their problem has been recorded.
  7. Users noticing software issues can't easily look to see if those issues have already been reported.
  8. In any normal business environment, people tend to submit issues multiple ways: phone, email, paper, even a hallway conversation. When those issues are not consolidated in one place, it becomes remarkably easy for issues to fall off the radar.
  9. Having no system of record for problems worsens business relations and causes randomizing of the team: in particular, submitters tend to repeatedly request personal updates from someone they know. 
  10. With no central place for updates, managers can't get much of the overview they need to make tactical and strategic decisions. So instead, they can spend a lot of time tracking down individuals to gauge where things stand.
  11. Other than shuffling laboriously through a stack of cards, no one can easily understand how many known bugs are left to fix, what the severity of those bugs is, etc.
  12. Release readiness evaluations can't take into account the rate of bug discovery, which can be key to deciding if the software is truly launch-worthy.
  13. A scribbled index card can't be easily consulted later to find out what was the solution.
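Several of the points above (spotting a known problem in a new guise, transferring ownership, counting open bugs by severity before a release) come down to having bugs as structured, queryable records rather than scattered cards. A minimal illustrative sketch of what even the simplest tracker provides -- the field names here are hypothetical, not any specific tool's schema:

```python
from dataclasses import dataclass

@dataclass
class Bug:
    id: int
    title: str
    steps_to_reproduce: str  # the detail an index card rarely captures
    expected: str
    actual: str
    severity: str = "medium"
    status: str = "open"     # open -> in-progress -> fixed -> verified
    assignee: str = ""

bugs = [
    Bug(1, "Footer overlaps cart total", "Add 3 items, open checkout",
        "Footer renders below totals", "Footer covers totals", severity="high"),
    Bug(2, "Login fails with '+' in email", "Register a+b, then log in",
        "Login succeeds", "500 error", status="fixed"),
]

# Is a newly reported issue already known? Search instead of shuffling cards.
known = [b for b in bugs if "footer" in b.title.lower()]

# How many open high-severity bugs remain before release?
open_high = sum(1 for b in bugs if b.status == "open" and b.severity == "high")
```

The point is not this toy code, of course, but that every question in the list above becomes a one-line query once the data exists in one searchable place.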

And to reject all these solid reasons (and yes, there are lots more) out of hand? Or to even opine, as one vocal Twitter poster actually did on this topic, that "memory is waste"? Well, there goes IT again, with another arrogant, amnesiac, self-inflicted wound. Bug tracking as a software development practice is tried and true, with far more upside than downside (in short, not all overhead is waste, contrary to what you'll be told). Yet some firms inexplicably allow teams to reject bug tracking nonetheless, and that's the problem. In fact, as cited here,

"Much of the blame for [software project] failures [lies] on management shortcomings, in particular the failure of both customers and suppliers to follow known best practice. In IT, best practice is rarely practiced. Projects are often poorly defined, codes of practice are frequently ignored and there is a woeful inability to learn from past experience'."

Bottom line: when you hear people dismissing the need for various long-standing, solidly understood processes in the development of software, it's appropriate to be tremendously skeptical. Unless you want to be part of self-inflicting the next wound.


This blog is reposted with permission from Peter Kretzman. To read Peter's Blog, you can visit:


Smart Cities: The Internet of Things and Everything



The whole technology world has been going gung-ho about the Internet of Things (IoT) lately. In layman's terms, IoT is the network of smart devices that can 'talk' to each other. According to estimates, by 2020 there will be over 50 billion smart connected devices, which means around seven devices per person will be connected to other devices in the world through the internet/cloud. India is not behind in the IoT revolution: the Indian government recently announced plans to build 100 smart cities. But of course, these smart cities will have to be connected through smart devices in transportation (cars, public transport), the electricity grid, home appliances and so on.

Let us take an example to understand how IoT works. Imagine you are driving in a smart city like Delhi (oh yes, Delhi will soon be smaaaart); your car will be fitted with GPS and other signalling devices that communicate with the other cars in the vicinity (think of the parking problems in Connaught Place). The sensors in the parking lot will communicate directly with your car to tell it whether a space is available and where it is. The same sensors can provide critical data to traffic monitoring systems to direct and regulate traffic.
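The sensor-to-car exchange described above is essentially a publish/subscribe pattern: each parking sensor publishes its occupancy state on a topic, and cars (and the traffic-monitoring system) subscribe to it. A toy simulation of the idea -- the class, topic names and message format are invented for illustration, not any real IoT protocol:

```python
from collections import defaultdict

class Broker:
    """Toy message broker: sensors publish, cars and traffic systems subscribe."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        for callback in self.subscribers[topic]:
            callback(message)

broker = Broker()
free_spaces = {}

# A car subscribes to parking updates for Connaught Place.
broker.subscribe("parking/connaught-place",
                 lambda msg: free_spaces.update({msg["space"]: msg["free"]}))

# Parking sensors publish their occupancy state.
broker.publish("parking/connaught-place", {"space": "P1", "free": True})
broker.publish("parking/connaught-place", {"space": "P2", "free": False})

available = [space for space, free in free_spaces.items() if free]
```

Real deployments use lightweight messaging protocols over constrained networks, but the shape is the same: many small devices publishing state, many consumers reacting to it.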

That was the good picture of IoT. But not everything is great as far as IoT is concerned. "Imagine a world where you could turn on your porch light from the office or unlock your door for a visitor, all from a smartphone app. Well, like a growing number of early smart-home adopters, I have seen this future today -- and let me tell you, it's a mess." - Don Conlon, CEO, CoCoon.

A few issues pertaining to IoT are:

IoT is not hack-proof: IoT is based on RFID tags, and there is absolutely zero security built into them. You don't even have to touch a tag to hack it.

Privacy: Companies don't make much money providing IoT as a service; the real money is in the Big Data. Companies want to learn everything you do: when you wake up, when you sleep, how you eat your breakfast, when you watch a movie, when your kids get hungry. By monitoring your patterns, they can sell 'stuff' to you. The real money is in learning your behavior and then controlling it. Yes sir, be scared... very scared.

Security (different from privacy): IoT promises to connect everything: your sunglasses, your watch, your coffee machine, your TV, your car and even your pillow. Imagine the wired world where everything is connected. Ahh! I love that good old Matrix movie; at least somebody saw the future!

Interoperability: IoT devices are said to be using open standards. It's been 25-30 years since we started talking about open standards, but where have we reached? Nowhere. Ultimately there are two or three players that dominate the field: Apple, Google, Microsoft, IBM... Consumers have to take sides, adopt one standard and then build their entire personal technology ecosystem around it. Most likely, a Google device won't talk to an Apple device or a Microsoft device.

There are pundits who say IoT will increase consumer choice and make life easy. But there is no way IoT's problems will be fixed in a way that gives control to the consumer. Remember what I said: the big money is not in devices but in the big data. If companies give consumers the chance to 'choose', they won't be able to control their behavior, hence no money. And if there is no money, why the heck would big companies build IoT?

Definitely, before we jump on the smart city bandwagon, we should build enough security infrastructure to protect the interests of the state and its citizens.

This blog was first published at the Economic Times Website.

We repost it here from Author's LinkedIn Blog. 

He was technically sound and had a proven track record of creating and managing IT infrastructure in his chosen industry across multiple entities; recently he had moved into a position of power and influence in a new assignment and was enjoying flexing his muscles. Reaching out to vendors and others in the industry, he announced his intentions, engaging them in discussions that had most of them wanting to be part of his future plans. After spending more than two decades in the industry, he had aspirations of making it to the corner office.

The new company needed the fresh look he brought to the table; his hands-on approach was what was required to move them to the next level of efficiency. The company had consistently approached IT as a necessary evil, spending only when absolutely necessary, resulting in decade-old servers, geriatric laptops and desktops, and neglected teenage applications. The management took a penny-pinching view of IT budgets, always wanting to defer, delay or procrastinate on decisions after squeezing the proverbial last drop from vendors.

His demeanor, driven by professional knowledge, bordered on the loud and brash, and at times tended towards arrogance. To his credit, he built relationships and trust quickly with the management, getting their ear and then toeing the party line; with his ability to manage relationships without being disruptive to the culture, he became part of the inner circle, getting a view into the workings of the company though unable to change the decision-making inertia. So he decided to enlist external help, which he hoped would trigger positive change.

The hired external consultant provided a reality check and a direction to take, which he reviewed with his internal peers. Collectively they had a limited view, which stalled progress; they knew what needed to be done but did not know how to get started with internal buy-in. Sliding timelines favored no one, and the report finally elicited a view from the management chastising him for not delivering what was required. In the absence of articulated and documented expectations, it was like hitting a moving target in the dark!

He attempted to calm ruffled feathers on both sides, though he finally bowed to the management view, his subservient survival instinct prevailing over his professional pride. Capability and self-belief are a function of experience endorsed by past success. Professionals take a stand with conviction driven by confidence built on a foundation of deep expertise in a specific area; the shallowness of bravado is quite evident when challenged. He backed off and tentatively offered to build a bridge that would salvage the situation.

The status quo dragged on for a while, with silence and no action leaving the organization in suspension and users becoming restless after having seen a ray of hope. The company, used to suboptimal process and technology solutions, continued to labour with obsolete and incomplete systems while the industry raced ahead into a new world of digitally enabled customer engagement. For the newcomer, it was a struggle for meaningful existence while wanting to change the outcomes he knew were possible with some of the recommendations.

Any strategy or plan with no resources, will, or buy-in to execute is a waste of time, effort, storage space and paper! Contextually, the difficulty was the management's unwillingness to define what was required while expecting not just the necessary but the best of outcomes. There are many avenues to explore to break the deadlock, and advice that coaches and consultants can offer to overcome the situation. Conventional wisdom will preach that the service provider should take a step backward and find a way to resolve the stalemate.

If you were the protagonist, what would you do? Would you take a stand and risk losing your job? If you were the management, would you give up your high chair and create a tripartite agreement on expected outcomes? And if you were the consultant, how would you resolve the impasse with no access to the management and a dependence on IT to find a solution? If we analyse the situation, every stakeholder erred in the beginning by not defining the baseline or setting detailed, explicit expectations. It's a lose-lose scenario now!

This blog is reposted here with prior permission from the author and is directly attributed to him.


Will Wearable Tech Invade the Enterprise?


Wearable Devices.jpg

Though it's old news, it's worth a mention to begin this blog.

With flying becoming less and less glamorous, last year Virgin Atlantic, the premier UK-based global airline, along with air transport specialist SITA, decided to test how wearable technology could be used to enhance customer experience and improve efficiency to make flying a pleasure once more. So, they took up a six-week pilot project with Google Glass. From the minute Upper Class passengers stepped out of their chauffeured limousine at Heathrow's T3 and were greeted by name, the staff (wearing the technology) started the check-in process. At the same time, the staff was also able to update passengers on their latest flight information, weather and local events at their destination, and translate any foreign-language information. The benefits to consumers and the business were to be evaluated ahead of a potential wider roll-out of wearable tech in the future.

That's a single example from the aviation industry. Now read this interesting example of how a mechanical engineer sensed a huge business opportunity for a wearables start-up. While touring a warehouse in the state of New Jersey last year, Haytham Elhwary, a Ph.D. student, realized how demanding and hazardous the job of workers moving boxes was. At any given minute, over 100 million boxes are moving back and forth across the USA. They're shipped to businesses and homes, reaching doorsteps and loading docks in days. But these boxes don't move themselves. Day after day, workers in warehouses bend their knees, take a deep breath, and then lift entire businesses up with their efforts. When they're injured, businesses are injured; their families are injured. That's what gave Haytham the idea to set up KINETIC, a start-up that makes wearable devices for material-handling workers that can significantly reduce lifting-related injuries.

The wearable device comes in two parts: a wristband and a back device, which can be integrated into the back brace of a worker. The device has sensors which, coupled with machine learning algorithms, can analyze the biomechanics of a worker during a lift and calculate the risk of injury. If the lift has a high risk of injury, workers get feedback on the wristband through a display, a red LED and, optionally, a light vibration.
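To make the feedback loop concrete, here is a minimal Python sketch of the kind of logic described above. To be clear, this is not KINETIC's actual algorithm: the features (trunk flexion, twist, lift duration), the weights, and the 0.7 threshold are all invented for illustration.

```python
# Illustrative sketch (not KINETIC's real model): score a lift's injury
# risk from a few posture features and decide the wristband feedback.

def lift_risk_score(trunk_flexion_deg, twist_deg, lift_duration_s):
    """Combine posture features into a 0-1 risk score (weights are made up)."""
    score = (0.6 * min(trunk_flexion_deg / 90.0, 1.0)
             + 0.3 * min(twist_deg / 45.0, 1.0)
             + 0.1 * min(lift_duration_s / 5.0, 1.0))
    return min(score, 1.0)

def wristband_feedback(score, threshold=0.7):
    """High-risk lifts trigger the red LED and an optional light vibration."""
    if score >= threshold:
        return {"led": "red", "vibrate": True}
    return {"led": "off", "vibrate": False}
```

In a real device the score would come from a trained model over raw accelerometer data, but the shape of the decision, a continuous risk score gated by a feedback threshold, is the same.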

The two examples above are distinct, and they very clearly demonstrate two specific types of usage of wearable technologies in the enterprise. In the case of Virgin, a standardized wearable product is used to boost the customer experience. In the case of KINETIC, the wearable device can bring huge benefits to any organization concerned about employee safety and health.

So, are we now convinced that wearable tech is no longer just a sci-fi movie gadget? Are we accepting that it is all set to go beyond just being the consumer technology of the future and will have a decisive stake in enterprises as well?

In February this year, Salesforce Research launched an initiative to find out:

  • Who is already using or planning to use wearables in the enterprise?

  • What are the value and role of wearables in business -- and how will they evolve?

  • Which wearable tech types are having the biggest impact?

  • What common challenges do companies face when deploying these devices?

To have credible data, an online survey was conducted with 1,455 full-time business professionals in the U.S., 500 of whom are currently using or planning to implement wearable technology. The result of this mammoth exercise was encouraging:

  • A majority of adopters (79%) agree that wearables are (or will be) strategic to their company's future business success.
  • Among current users, 76% report already seeing improvement in their business performance since implementing wearable device technology.
  • Due to the significant success these companies are seeing so far, 86% expect their wearable technology spend to increase over the next 12 months.

The Salesforce study suggests that companies are even embracing bring your own wearables (BYOW), with 54% currently supporting a BYOW model and an additional 40% planning to support this model in the future.

Market growth trends wearables.jpeg

"While wearable technology isn't all that popular and ubiquitous, advertising companies are already conceiving of ways to deliver marketing messages directly to people who sport computerized watches, glasses and headgear. After all, the thinking goes, where there's a screen, there's an opportunity--and if projections are correct that sales of wearables could reach over 130 million units and gross almost $6 billion by 2018, that opportunity is a big one," mentioned a PwC Report.

The market for smart wearable devices that use sensors and wireless technology to transmit data has received a boost in recent years. Juniper Research, in its report "Smart Wearable Devices: Fitness, Glasses, Watches, Multimedia, Clothing, Jewelry, Healthcare & Enterprise 2014-2019" released last year, says that global retail revenue will triple in volume by 2016 before eventually reaching $53.2 billion in sales by 2019.

Wearable technology is not just fancy stuff anymore. Serious uses like workplace security access, employee time management and employee communications are on the anvil.

I spoke to a couple of healthcare CIOs in India and they are all experimenting with SIM-based wearables to track the health of their patients. These wearable devices will work in two ways: one, the sensors will provide health data to the hospital staff, which can help remotely monitor the patients; and two, they will alert patients on reaching thresholds or send them reminders.
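That two-way flow can be sketched in a few lines of Python. The vitals and threshold values below are hypothetical placeholders; a real deployment would use clinically defined limits and a proper telemetry channel.

```python
# Illustrative sketch of the two-way model described above: every reading is
# forwarded to the hospital's monitoring feed, and out-of-range readings also
# raise an alert on the patient's device. Vitals and limits are made up.

THRESHOLDS = {"heart_rate": (50, 120), "spo2": (92, 100)}

def process_reading(vital, value, hospital_feed):
    """Forward the reading to staff; return a patient alert if out of range."""
    hospital_feed.append((vital, value))      # one: remote monitoring by staff
    low, high = THRESHOLDS[vital]
    if not low <= value <= high:              # two: patient-side threshold alert
        return f"ALERT: {vital} = {value} is outside {low}-{high}"
    return None
```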

If the above examples sound abstract, then here's a real-life example:

Desert Valley Medical Center in California, USA, has deployed the wireless Leaf Patient Monitoring System in its emergency department to help clinicians prevent hospital-acquired pressure ulcers. This wearable patient sensor can support medical professionals' efforts to prevent the occurrence of pressure ulcers. One study found that using the Leaf sensor increased compliance with hospital turn protocols - a standard of care to prevent pressure ulcers - from a baseline of 64 percent at the start of the trial to 98 percent after the monitoring system was deployed.

Wearable device market growth forecast.png

The Flip Side

Everything has a flip side, and wearables are no exception. From simple distractions to serious breach issues, wearables can spark anything.

PricewaterhouseCoopers (PwC) published a report titled "The Wearable Future," which found that 82 percent of the respondents were extremely worried that wearables would invade their privacy. 86 percent of them felt that the use of wearable technologies would make them more vulnerable to data security breaches. As a result of these apprehensions, almost a third of the wearable buyers (among the respondents) use their wearables only intermittently. Google Glass is based on Android and could simply inherit the vulnerabilities that the OS is famous for.

The security researchers at Kaspersky Lab are at work on the possible targets among wearables. Roberto Martinez, an analyst with the security firm, finds many things in common between the new and current devices. "They use the same protocols and are interconnected with other devices using similar applications. There is no way around this. Traditional attack vectors are mainly against the network layer in the form of Man-in-The-Middle (MiTM), the exploitation of some vulnerability in the operating system, or the applications themselves. Being based on Android, Glass could inherit known vulnerabilities found in other devices with the same OS," says Martinez.

Roman Unuchek, a researcher with SecureList, explains in his blog how he hacked his own wearable device. "From just six hours of scanning I was able to connect to 54 devices despite two very serious restrictions," he says. And this happened recently. "In some cases you can connect to wearable devices without the user even knowing it. The fraudster could take control of your wristband, make it vibrate constantly and demand money to stop it," says Roman.

So if someone talks about the safety of wearables, it would be foolish to guarantee anything. They are safe only until hackers find value in hacking them. In the case of enterprises, this may happen soon once they are used en masse.

It's a classic business trade-off for enterprises. While on one hand wearables are staring at a great future and are ready for enterprise adoption, stray cases of data breaches and privacy violations may force enterprises to defer the idea.

Are you ready for experimenting with Wearables?


How IT Enhances Productivity and Efficiency in Retail


Retail .jpg

In today's age of disruptive innovations, technology is playing a far bigger role than ever before. It contributes directly towards the top line and bottom line of businesses.


IT acts as a business enabler and driver rather than just a support function. Further, it plays a pivotal role in the retail business, as the nature of the business is complex, with huge volumes of transaction data to handle.

Today's customers are tech-savvy and look for a seamless shopping experience across multiple channels. Customers are looking for online access to in-store products, features and inventory information, with flexible options such as in-store pick-up, home delivery, cash on delivery, etc.


This trend is putting virtually all retailers to work on overcoming organizational, operational and technological challenges to provide customers an omnichannel experience.


Retailers are working towards integrating their backend technologies, sharing analytics between channels and training in-store employees to provide a competitive shopping experience in an omnichannel environment.


They are investing heavily in innovation and implementing cutting edge technologies to achieve scalability and agility in the diversified retail business to remain competitive and profitable.


Best-of-breed ERP solutions provide centralized control of retail data and a 360-degree view of the entire retail operation spread across geographies. This enables retailers to take informed decisions to optimize their inventory positions and prices, increase stock turns, and thus bring efficiency to supply chain operations in the dynamic retail business.


Therefore, it is imperative for businesses to choose a technology solution judiciously and after due diligence to achieve scalability, agility and robustness in terms of technological capabilities, keeping future business expansion in mind.

This blog was first published on the author's LinkedIn blog.



Life in 2025: Will Humans be Humans?

Connected world.jpg

Yesterday, for a very brief while, there was an Internet outage at office. Though the problem got rectified in less than 15 minutes, work almost came to a grinding halt - no mails, no searches, no work!

It compelled me to think: if today's Internet, which isn't yet fully ubiquitous and pervasive, can do this to us, what will happen 10 years from now when we have 50 billion devices connected to each other (read: the Internet of Things, or IoT)?

Alas! My brain stopped thinking further.

The famous lines of Eric Schmidt struck my mind: "The Internet is the first thing that humanity has built that humanity doesn't understand...the largest experiment in anarchy that we have ever had."

That took me to an interesting blog posted on the Pew Research Website.

Pew Research asked over 2,500 'experts and technology builders' where they think we will stand by the year 2025.

Without dragging you into the length of their dialogues (though they are very enticing), I would simply cut-paste the following few paragraphs to help you understand where technology is going to take us in the next 10 years:

The experts predict an ambient information environment where accessing the Internet will be effortless. Most people will tap into it easily. It will flow through their lives "like electricity." 

They predict mobile, wearable, and embedded computing will be tied together in the Internet of Things, allowing people and their surroundings to tap into artificial intelligence-enhanced cloud-based information storage and sharing.

The experts predict there will be:

  • A global, immersive, invisible, ambient networked computing environment built through the continued proliferation of smart sensors, cameras, software, databases, and massive data centers in a world-spanning information fabric known as the Internet of Things.
  • "Augmented reality" enhancements to the real-world input that people perceive through the use of portable/wearable/implantable technologies.
  • Disruption of business models established in the 20th century (most notably impacting finance, entertainment, publishers of all sorts, and education).
  • Tagging, data-basing, and intelligent analytical mapping of the physical and social realms.

This bogged me down totally and, for a moment, I thought, "Was the Internet really invented to do all this?" I always thought it gave humans the power to connect with each other. Who the hell thought that the Internet would advance beyond humans and connect things?

You never know; in the next 25 years it may connect every living being with every non-living thing. Pardon my rhetoric, but what we see today is the result of what went on during the last 15 years in technology labs. By that logic, the next 15-20 years will be even more disruptive.

We can expect wearable devices and sensors for animals - not just for tracking them but for their urges, needs, calls! Or, you never know, there may already be some...

Today most things are guided by the Internet. In the future, I wouldn't be surprised if most human interactions, especially around health, education, work, finance and entertainment, are fully dependent on and governed by the Internet. If there's no Internet, there's no education, no health, no money.

Are we talking of automated humans? Or are we talking of human robots? What is it?

Look at this website, which lists the Top 50 Sensor-based Apps for a Smarter World. Today there are 50; tomorrow there can be 500, 5,000 and so on. Will humans live life only out of an app?

The Internet and its ubiquitous stature have no doubt brought a lot of things closer to us that were unimaginable about two decades ago. The Internet for health, for education, for banking and for many more things has been a boon. We can't close our eyes to those massive pros and can't take the credit away.

But the positive side cannot be appreciated in isolation, and it isn't the only aspect. What's the negative side? Perhaps the business world has already started realizing the dangers of over-connectivity, and with IoT it is going to get worse.

But the socio-political nuances of a hyper-connected world are yet to come to the fore.

In the same article on the Pew Research Website, Oscar Gandy, an emeritus professor at University of Pennsylvania, cautions: "We have to think seriously about the kinds of conflicts that will arise in response to the growing inequality enabled and amplified by means of networked transactions that benefit smaller and smaller segments of the global population."

Another interesting observation comes from Llewellyn Kriel, CEO and editor in chief of TopEditor International Media Services. He predicts: "Everything -- every thing -- will be available online with a price tag. Cyber-terrorism will become commonplace. Privacy and confidentiality (of any kind) will become a thing of the past. Online 'diseases' -- mental, physical, social, addictions (psycho-cyber drugs) -- will affect families and communities and spread willy-nilly across borders."

I am not sure whether this is the future we all are ready to embrace!

I would end here with an even nicer quote from Neil Gaiman who said: "Google can bring you back 100,000 answers. A librarian can bring you back the right one."

Both Neil and I can be classified as old-economy guys, but the reason to write this blog wasn't to bring negativity; it was to make everyone aware of the flip side of the story.

The Value of Cyber Threat Intelligence



I just came back from the Next Generation Security Summit in San Antonio, Texas and I have to say that there were two surprising themes - cloud and cyber threat intelligence.  I could speak about the vendors and their unique or specific offerings, but that would dilute the story a bit.  Instead I want to talk about a conversation I had regarding cyber threat intelligence tools and why they are gaining in such popularity.

Making do with less or making less do more

Whenever a tool is purchased for InfoSec purposes, there are only two quintessential reasons for doing so (and the cool factor isn't one). The first is to lighten the load on an overburdened workforce. This can be said of the acquisition of any new technology. The second (and not necessarily the lesser) is to enhance the capability of your overburdened workforce. These might seem like similar goals, but trust me when I say they serve completely divergent purposes. One is to keep from having to add staff; the other is to improve the information - basically by increasing its quality and condensing it into a single view or dashboard.

May I please have some more Cyber Threat Intelligence?

Why yes, you can! But then again, that depends largely on whose definition of cyber threat intelligence you accept, coupled with how deeply you are willing to integrate cyber threat intelligence into your InfoSec program. It's not as easy as it sounds. To use cyber threat systems adequately you need three things: an organizational lifecycle for gathering, processing and disseminating information; a program for taking that information and turning it into actionable workflows (i.e., turning it into intelligence); and a willingness to separate InfoSec from the typical IT service culture (IT and InfoSec are not the same thing and they don't do the same things).
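The lifecycle of gathering, processing and disseminating can be sketched roughly as follows. This is a toy illustration, not any vendor's pipeline; the indicator format, the confidence scores, and the "block" action are all assumptions made for the example.

```python
# Toy sketch of a cyber threat intelligence lifecycle: gather raw indicators
# from several feeds, process them into de-duplicated, high-confidence
# intelligence, and disseminate actionable workflow items to the InfoSec team.

def gather(feeds):
    """Merge raw indicators from internal and external feeds."""
    return [indicator for feed in feeds for indicator in feed]

def process(indicators, min_confidence=0.6):
    """Raw information becomes intelligence: de-duplicate, filter by confidence."""
    seen, intelligence = set(), []
    for ind in indicators:
        if ind["ioc"] not in seen and ind["confidence"] >= min_confidence:
            seen.add(ind["ioc"])
            intelligence.append(ind)
    return intelligence

def disseminate(intelligence):
    """Turn each intelligence item into an actionable task for the team."""
    return [f"block {item['ioc']}" for item in intelligence]
```

The point of the sketch is the shape of the program, not the code: raw indicators only become intelligence once something filters, enriches and routes them into a workflow someone actually executes.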

Many organizations with cyber threat programs have already discovered that cyber threat intelligence, from internal and external sources, provides value when it is researched, analyzed and disseminated in a timely manner and when there is a mechanism for sharing information across groups. How is this so? Well, I am glad you asked. The following is a short list, by no means exhaustive:

  • By changing the InfoSec model from reactive to proactive you begin to understand your adversaries and thus create tactics to aggressively thwart current attacks, plan better for the ones to come, and gain experience in the processes of battling cyber criminals.
  • Likewise, because you are compressing the security alert time frame, you also relieve the one key pressure on InfoSec teams: the pressure to respond.
  • Your information dissemination greatly improves - especially to non-technical decision makers.  Better, more informed, responses to security incidents can compress the overall life-cycle of an attack - this includes involving vendors at the most appropriate times.
  • Adding to this is the improved communications that normally takes place between the InfoSec team, all levels of management and board members who are ultimately responsible for cyber security.
  • Last but not least are ROI and TCO - often used as buzz terms, but the real impact of investment in cyber intelligence platforms is significant. Legacy security technologies (at least legacy from a cyber threat POV) can receive a breath of new life by being fed cleaner information that is actionable and real-time.

Clearing out the Noise

There is a plethora of explanations of what cyber threat intelligence really is. This often happens with emerging tech, partly from the market space trying to define itself, but also because the vendors want to differentiate themselves or control the narrative. Clearly, as in every other area of business, cyber threat intelligence products are not all created equal. The sad truth is that many vendors have re-branded their older systems with a couple of new features and are calling them cyber threat intelligence systems. Cyber threat systems are not the same animal as those that feed raw information or other unfiltered indicators into a repository for another system or a member of the InfoSec team to muddle through for some glint of information.

Many people, including vendors, confuse information with intelligence. Most InfoSec teams will tell you that more raw information is not what is needed. We don't need a definition of information vs. intelligence here; I think we all know the difference, and if you don't, go ahead and Google it. It's far more important here to understand how human-based and technological intelligence can merge to create valuable, timely, accurate and actionable intelligence. This intelligence can then be mobilized to impact planning and operational concerns surrounding existing and emerging threats.

Energizing the Discussion

Cyber threat intelligence can become a way of energizing internal discussions surrounding cyber security. Remember, your Board of Directors is ultimately responsible for the cyber security posture the organization takes. Depriving them of information and expecting them to make informed decisions isn't going to thrill them when the lawsuits start flying in. For the business as a whole, cyber threat intelligence can be a way to engage operational areas of the business in the relationship between security and the business. It can change their operating models from reactive and threat-based to proactive and risk-based, thus driving more timely and relevant decision making.

This blog was first published on the author's LinkedIn profile. It is reposted here with prior permission and is attributed to him.

Three Surprising Digital Business Trends for 2015


Enterprise Wearables.jpg

PwC recently published an interesting finding from one of its research studies. In its Digital IQ Survey, the consulting giant found that enterprise business and IT leaders see traditional technology areas like cybersecurity (69%), private cloud (61%), and data mining and analysis (54%) as their top investment priorities.

While PwC sees these three areas as the ones businesses need to keep their lights on, they may no longer distinguish the respective organisations from the rest. The business leaders at PwC strongly suggest looking at a "trio of picks" that they believe have extraordinary potential, despite being off the radar screen of many businesses.

In fact, wearables (3%) and NoSQL databases (6%) placed at the very bottom of the priority list for survey respondents, ranking last in investment and perceived value among a field of 24 technologies. While more companies are investing in sensors for collecting business information (23%), sensors are still underrated and grossly underinvested in by several sectors.


PwC says that its surprising bets raise important questions for every enterprise, and, as with any technology ranking, organisations will consider both shorter- and longer-term bets.


That "the future of digital is promising" is no surprise. CIOs and business leaders need to not only take note of the rising influence of digital technologies but also equip themselves to embrace them.



For more details and a comprehensive reading, refer to the PwC website.

The information in this blog is taken from the PwC website.



Enterprise Risk .jpg

I read somewhere about the government's intent to increase budget allocation towards fighting cybercrime and creating cybersecurity awareness. The link was hidden somewhere towards the bottom of the newsletter; I quickly clicked through to read the good news word by word and realised that it was indeed true! The chart showing CAGR was quite impressive, with the trend line going north; then I looked again at the Y axis to find that the investment per annum was so low that the entire news was actually too scary to be funny.

Not too long ago, when I wrote about Creating a Secure & Safe Enterprise, many CIOs and CISOs wrote back with their personal experiences; most of them agreed that their realities were reflected within. One of the interesting facts that emerged is that budgets were a challenge, but then not really a challenge when an incident occurred. With corporate focus on short-term goals and measurement of tactical performance, the biggest challenge that everyone unanimously portrayed was that of sliding priorities, with security settling close to the bottom.

Why is security investment such a drag when it comes to budgeting and spending? Why do enterprises, and by that I imply the CXOs who collectively represent the management, believe that they don't really need to invest in protecting their information assets, which are the family jewels in most cases? What creates such a lackadaisical attitude towards creating process and policy and implementing tools that provide a secure framework to do business, despite the fact that threats are increasing and businesses are losing customers, revenue and credibility?

Everyone agrees in principle that security is a must; they (the CXOs) espouse this in conferences and project themselves as messiahs of information protection and security. When one such leader was asked a pointed question about the budget allotted, he sidestepped the question deftly, talking instead about how the industry needs to up the ante. The lip service that ignores the elephant in the room is beginning to hurt enterprises. The cause of such an attitude towards keeping the doors and windows open has to run deeper.

I am not that big and not an attractive target for anyone! Why would any hacker want to breach our security? Our customer data is locked up on one computer and only two people have access to it; they are both trustworthy. We don't have anything worth stealing, so why would anyone compromise our systems? We know internal threats are higher than external; we have information distributed across multiple solutions, so no one can decipher the full picture; we have locked USB, installed anti-virus and firewall, isn't that enough?

Is lack of awareness or education creating a false sense of security and complacency? Or is it a belief that such things happen only to others? Is CXO ignorance and indifference an acceptable way of defining the security posture of an enterprise? When you live dangerously, sooner or later an adverse incident occurs, and that is when the scapegoat syndrome ends up pointing fingers at the CIO or the CISO, and/or the service provider. Breaking this paradox is the need of the hour for enterprises.

No one wants to fall sick or die, but everyone takes health and life insurance! Investments in security are like insurance to protect the business. Physical security has seen this paradigm shift, with electronic tags and biometric solutions becoming the norm. With the number of threats increasing and new ones emerging, the education of CXOs is not just an imperative but an urgent need. CIOs, CISOs, Internal Audit and Risk Committees have to own the information protection agenda and drive it with their collective might.

Using ethical means to understand vulnerabilities and fix them should be high on the corporate agenda towards creating a safer digital enterprise. Customers and consumers are becoming sensitive to this, and some are already beginning to take their business elsewhere. A safe and secure ecosystem is required for the extended enterprise, including suppliers, contractors, partners and customers. The writing on the wall is that companies that emerge as secure digital enterprises will be the winners of the future.

Where are you?

This blog first appeared on CIO Inverted, the author's blog, and is reposted here with prior permission.


Complexity - The Killer of Agility?



I've said a few times that the data centre of today isn't the data centre of yesterday nor is it the data centre of tomorrow. In fact, in "The Data Centre of Tomorrow" I wrote that the: "data center will be a combination of internal and external systems that combine to create an agile, efficient and effective technology delivery platform."

While I believe that will be the case in the near future (if it isn't already the case today), the data centre of tomorrow has the potential to add complexity to the organisation's IT systems and platforms. This complexity may simply replace other types of complexity, or it may add to the data centre's existing complexity. Either way, complexity has the potential to be an agility killer if it isn't managed or planned for correctly.

Complexity has always existed within the data centre. From the first days of data centres, IT professionals have had to manage complexity, but in recent years there has been quite a bit of growth in complex systems within the data centre. With companies increasing their use of virtualisation within their data centres, connecting data centres with the cloud, and implementing new platforms and systems every year, the level of complexity continues to increase.

Without proper thinking and planning, this complexity can have a negative effect on agility within the data centre. There are a few things that organisations can do to manage complexity within the data centre while keeping agility at the forefront for the IT group and the data centre. A few ideas for managing complexity are:

      • Get visibility into the platforms throughout the organisation to ensure that the IT group understands what platforms the business has
      • Get visibility beyond the platforms to allow IT to understand the business processes that are driving platform changes
      • Ensure open communication channels between all groups within the business to ensure when a new platform is needed or wanted, IT is informed and involved in the decision making process
      • Have a proper business technology strategy that drives all technology projects
      • Build a technology council and invite members from all areas of the business to allow different opinions and insights into the technology strategy of the organisation

While these ideas may not seem that great at first, they are a starting point for understanding the technological systems and platforms within the company. By understanding your platforms, you can understand the complexity that exists (or might exist in the future) and help keep agility alive.

This blog was first published on the author's blog and is reposted here with prior permission. For more blogs from the author, click here.


Smarter Cities Okay But What About Smarter Citizens?


Smart Citizen Kit.jpeg

Just yesterday I read an interesting (though very basic) news report on IBN Live about a smart city being built on the banks of the River Sabarmati in the state of Gujarat. Why not? The Prime Minister of India has a special liking for concepts like "Smart City", "Digital India" and "Smart Governance", and has an ambitious plan to convert 100 Indian cities into smart cities at an expense of approximately US$1 trillion. This city, however, is still in the works and will take a long time to become the showstopper.


This cliché of "Smart" has now percolated far and wide. Everyone seems to be talking about building smart cities to get rid of burgeoning urban infrastructure problems such as pollution (poor air quality), traffic, water, electricity, waste and other critical infrastructure management issues. It's a different matter that most Indian urban townships, barring a few that came up recently, do not qualify to be converted into fully smart cities, for the following three key reasons:

1.     It's difficult to get rid of legacy infrastructure and dig up most parts of a city. Infrastructure that should go underground cannot be sustained above ground, yet that is how the planners have built it in some cases.

2.     The urban planning of those cities is archaic and doesn't support the tenets of smart cities of today. It will be a prohibitively costly affair to redo the urban planning and readjust the city.

3.     The citizens are naïve to this whole concept of "Smart City." To them (at least most of them), a smart city means just the automation of a few processes like electricity meters, water meters and Wi-Fi. They don't understand their own contribution to it.


This brings me to the core part of this blog. I read an exciting piece of news published last year on a portal called "Amsterdam City." To my surprise (and embarrassment too, as an Indian citizen), the country believes that at the core of a smart city there are "smart citizens". I am in no position to write a 100-page consultation paper and an economics white paper on what and how we shall be creating smart citizens, but there's one example that completely zapped me.


There's something called the Smart Citizen Kit in Amsterdam. Some of you may be aware of it. The whole debate in India about unbelievably poor air quality, and the alarming conditions it has created, appeared to me like a joke when I read about the Smart Citizen Kit and how the people of Amsterdam help make their city one of the world's best and most liveable places.


The following few paragraphs are verbatim reported from the "Amsterdam City" website:


"The Smart Citizen Kit was devised out of growing concerns of citizens about the quality of their air. The difference between this Kit and other measuring tools is the active involvement of 'ordinary people' in the measuring process. In this project, Waag Society and Amsterdam Smart City want to install a network of sensors all through Amsterdam. The Kit can measure humidity, noise pollution, temperature, CO, NO2 and light intensity. Participants fasten the Kit somewhere outside their house, e.g. outside their window, or on their balcony. The Kit takes measurements and conveys the results through the internet connection of the participant.

Amsterdam has been chosen for a pilot project. This is a test case for the hardware (Can this high tech gizmo withstand the Dutch rains?), its software and the participation of its users. The participants will test whether it is user-friendly enough. 


On March 24, 2014, an Install Party took place for the Kit's Amsterdam participants. The 100 participants received their Kits and instructions. All data is shared simultaneously on the internet, so participants can compare their Kit's data with other parts of the city or the world. All data can be found on the project's website.  


The purpose of the project is to heighten awareness of the participants about the quality of their air. With this project, we are finding out on a practical level how this may take place. We can gather information on the results, the technical specifications of the Kit and its impact. And, how we can make the step from Smart Cities to Smart Citizens. 


On the 16th of June 2014 there was an evaluation of the results of the Smart Citizen Kit. The experiences were presented and the participants could share their findings of the measurement experiment. The measurement data and technology have to improve but the attendees were pleased with the initiative and the fact that many citizens are committed to measure the living environment. In addition, a visualisation of the noise measurements has been made. Of each weekday, the average noise level per session is mapped with the audio data of all kits that supplied at least a full week of data."

So the crux here, without delving into any further detail, is: what makes a city smart? The automation that goes unnoticed, or the smart citizens who make the city smarter and more liveable?

Another slapdash yet very relevant mention here: Ericsson last year published a wonderful report on "Smart Citizens." I read parts of it and learnt how the Internet can be leveraged to help citizens make smart choices.

(This study was conducted online in September 2014 with 9,030 iPhone and Android smartphone users aged between 15 and 69. Respondents were from Beijing, Delhi, London, New York, Paris, Rome, São Paulo, Stockholm and Tokyo, representing 61 million citizens. Written statements outlining the concepts tested were rotated so that each respondent saw two thirds of all services.)

Christine Luby in her blog "Smart Citizens Make Smart Cities" sums up beautifully: "Why is this important? Well, besides the obvious fact that we need to make cities a better home for a majority of the world's population, this kind of behaviour and thinking is driving the development of ICT as a key urban infrastructure, just as important as roads, buildings and other physical infrastructure."

You and I too can add value to make a city really smart.

Your views!

Can a Company Remotely Wipe an Ex-Employee's Device?


One of my favourite sayings about cyber risk is "an ounce of prevention is cheaper than the very first day of litigation." A recent case provides a nice example of exactly what I mean. In this case, an effective BYOD policy could have saved this company tens of thousands of dollars, at least.

When Can a Company Remotely Wipe an Employee's Device?

Consider this question: When is it lawful for your company to remotely wipe an employee's (or former employee's) device that was connected to your company's network and contains its proprietary data?

It depends. If your company has a binding agreement with the owner of the device, such as an effective BYOD (bring your own device) policy, then it should provide the answer. If not, the only way to find the answer is through costly and time-consuming litigation.

The dispute in Rajaee v. Design Tech Homes, Ltd. illustrates this point nicely. In that case, the employee claimed that he had to have constant access to his email in order to do his job. His employer did not provide him with a mobile device so he used his own personal iPhone 4 to do his job.

His iPhone was connected to his employer's network server to allow him to remotely access the email, contact manager, and calendar provided by the employer. The employer and employee later disagreed over who connected the device to the network or whether it was authorized.

The employee resigned his employment and, a few days later, his former employer's network administrator remotely wiped his iPhone, restoring it to factory settings and deleting all the data -- both work-related and personal -- from the iPhone.

The employee then sued his former employer, claiming that the employer's actions caused him to lose more than 600 business contacts collected during his career, family contacts, family photos, business records, irreplaceable business and personal photos and videos, and numerous passwords.

He asserted claims for violation of the Computer Fraud and Abuse Act, Electronic Communications Privacy Act, and various claims under Texas state law.

The lawsuit was filed in August 2013. Due in large part to fine lawyering by my friend Pierre Grosdidier and his colleagues, who represented the employer, the case was dismissed in November 2014. While this was a "win" for the employer, that win came at a significant cost.

An Ounce of Prevention ...

Litigation is not only costly, but it is also very time-consuming for management. It results in lost opportunities to further companies' business objectives because finite resources must be devoted to the battle instead of to the company's business. Of litigation, it is often said that the only ones who ever really win are the lawyers representing the parties. That is usually true.

In the Rajaee case, the employer was represented by a very well-respected "big" law firm that did an excellent job for their client. But, good lawyers come at a price and I am quite certain the lawyers in this case were not working for free. This case was litigated for about 14 months.

There were 43 entries on the court's docket, which shows there was quite a bit of activity considering only the documents filed with the court. That does not include the discovery that was conducted (which is not filed and does not appear on the docket), but motions listed on the docket show there were discovery disputes and that the parties were active in discovery.

What all of this means is money -- lots of money that the employer paid in legal fees to get this win. Probably many tens of thousands of dollars in fees. From a practicing lawyer's perspective, that is great because the clients get the win and so do we lawyers!

But the truth is, good lawyers do not want to see their clients waste money so we look at situations such as this and ask, "could this have been avoided?" This helps us in advising our clients on how to avoid such situations in the future.

In this case, with the benefit of 20/20 hindsight, were we able to go back in time and advise companies such as this before the underlying situation arose, yes, there was a much better way to go. First and foremost, the company would have listened when told "an ounce of prevention is cheaper than the very first day of litigation."

Then, it would have acted on this advice by taking the following steps:

  • There would have been a conversation between the company's management, appropriate IT and security leaders, and legal counsel to discuss the company's position on BYOD.
  • The conversation would have considered if the workforce would even be allowed to use their own devices.
  • If the answer was "no, BYOD will not be permitted" then appropriate policies and procedures would have been adopted and documented.
  • If the answer was "yes," then the discussion would have continued to address more specifics on how the company would manage BYOD and the many risks associated with it. Focusing only on the particular issues in Rajaee, the discussion would have resulted in the creation and adoption of a BYOD Policy (or another similar policy) that addressed a key issue as a condition precedent to authorizing and permitting use of the device: By connecting the device to the company network or using it for company business, the user would expressly agree that he or she authorized, and would permit, the company to access the device and securely remove its data at any time the company deemed necessary, either during the relationship or after. And, if the user did not make the device available within a certain period of time after demand, the user authorized the company to remotely wipe the entire device and restore it to its factory settings in order to ensure that its data was securely removed from the device.
  • For either answer, yes or no, the company would have implemented and adequately trained its workforce on the policies and procedures to ensure they were aware of, understood, and agreed to abide by the policies and procedures.
  • Finally, the company would have documented the implementation, training, and workers' agreement in a manner that could clearly be shown to a court should a dispute ever arise involving such issues.
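The wipe-authorization condition described in these steps can be sketched as a simple decision rule. This is an illustrative sketch only: the record fields, function name, and seven-day grace period are assumptions for the example, not terms from the Rajaee case or from any real MDM product.

```python
from dataclasses import dataclass
from datetime import date, timedelta
from typing import Optional

@dataclass
class ByodEnrolment:
    """Hypothetical record of a user's BYOD enrolment (illustrative fields only)."""
    wipe_consent_signed: bool    # user agreed, in the BYOD policy, to remote wipe
    device_surrendered: bool     # device was made available for secure data removal
    demand_date: Optional[date]  # when the company demanded the device back

GRACE_PERIOD = timedelta(days=7)  # assumed policy value

def full_wipe_authorized(e: ByodEnrolment, today: date) -> bool:
    """A full factory reset is authorized only if the user consented up front,
    a demand was made, the grace period has lapsed, and the device was never
    surrendered for targeted data removal."""
    if not e.wipe_consent_signed or e.device_surrendered:
        return False
    return e.demand_date is not None and today - e.demand_date > GRACE_PERIOD

# Consent on file, demand made 10 days ago, device never returned: wipe allowed.
print(full_wipe_authorized(ByodEnrolment(True, False, date(2015, 1, 1)), date(2015, 1, 11)))  # True
```

The point of encoding the rule this way is the same as documenting the policy: every branch of the decision is explicit, so there is something concrete to show a court if a dispute ever arises.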

What is the Takeaway?

The lesson here is that all businesses must now understand that they are operating in the digital world, and in that world there are many risks that one would not ordinarily expect. That is why it is important for companies to proactively prepare for and take steps to minimize the risks of doing business in the digital world.

The Rajaee case focuses on just one of the many risks that need to be addressed.

The above steps are a bit oversimplified but illustrate the process of how to address this issue. The company most likely would have paid less to its lawyers to address the BYOD issue beforehand than it would have to pay them to prepare and engage on the very first day of a lawsuit similar to Rajaee.

As you think about that, I will leave you with one more old saying, "A smart man learns from his mistakes. A wise one learns from the mistakes of others." 

This blog was first published on the official blog of Norse "Dark Reading". It is reposted here with prior permission.   



Will Internet of Things Sweep the Enterprise?




While the buzz and bubble of the Internet of Things (IoT) seems to be on a never-ending roll, I am trying to get an answer to a key question: will it really sweep the enterprise space?


Two recent developments brought my focus back to this complex yet interesting topic of IoT. The first is the recently launched Verizon report "Internet of Things 2015", which claims with authority that by 2025, best-in-class organizations that extensively use IoT technologies in their products and operations will be 10% more profitable.


It's true that IoT is not godsent. Sensors, networking gear, cloud, RFID - the technology pieces that IoT rests on - have been known to us for quite a while. So what is it that IoT has changed, or will change, that is catching the attention of not just consumers but now also governments? And, will it attract the enterprise market?


Although it's a bit too futuristic, you'll be surprised to know that by 2030 machine-to-machine communication will account for more than half of all data produced, which is why it is so important to unlock its potential in time.


The second development was IBM's announcement of investing nearly US$ 3 billion in the IoT ecosystem over the next four years. It is a big investment by all means. What will IBM do with it? Well, the company says it will build a cloud-based open platform for industries to leverage IoT data. These new IoT cloud services will drive insights into business operations. The company also claims that over 2,000 IBM consultants, researchers and developers will help enterprise clients reveal new insights. Sounds really interesting, but it's the vendor's side of the story after all.


What's more intriguing is whether IoT is really capable of bringing true value to enterprises in areas like operations monitoring and management, utility and asset management, HR management, and so on.


The Verizon report makes an interesting mention of adoption figures. It says the adoption of IoT has been as high as 80% for some applications, but the number of enterprises that have adopted IoT extensively is just 10%. Who are these 10%? What type of companies are they?


Available data suggests that the top 14 automotive manufacturers, accounting for 80% of the worldwide market, have a connected-car strategy, which means they are heavily focused on experimenting with M2M communications.


If we believe the Verizon report, more than 13 million health and fitness devices will be introduced to businesses by 2018.


Sectors like manufacturing are pinning their hopes on IoT for sure. Finance and Insurance sectors are also taking keen interest but sectors like healthcare and pharma aren't too hot as yet.

An interesting White Paper by Oxford Economics claims that by 2016 nearly 53% of all manufacturers will start offering smart products.

Another interesting report, from Business Insider, gives a very convincing answer to my question. It says:


"The enterprise sector will account for 39% of the roughly 23 billion active IoT devices by 2019 - the largest among the three markets: home, government and enterprise. Enterprises will spend nearly US$ 255 billion globally by 2019. Manufacturers are the leaders in using IoT devices, and Business Insider estimates their total IoT investment will reach $140 billion over the next 5 years."


McKinsey mentions that widespread adoption of the IoT is certainly in near sight and that "the time line is advancing". This is all possible because of improvements in underlying technologies like wireless networking and the standardization of communications protocols, which make it possible to collect data from these sensors almost anywhere at any time. 


If the numbers are to be believed, a lot is going to happen in the IoT space in the next five years. But if we set aside these enthusiastic numbers for a moment and turn to some hard facts about why enterprises are not, or won't, really go full throttle on IoT, we can understand the gaps.


Issues like interoperability and cyber security feature in every discussion around IoT. There is also the issue of who owns the data, so the leadership has to come out clear on the hierarchy.


Chris Kocher, Founder and Managing Director of Grey Heron, in his blog Internet of Things: Challenges and Opportunities, beautifully describes some of the challenges staring at enterprises. These are primarily in the areas of privacy and trust, complexity, competing standards, and security. Kocher also says there aren't very many concrete use cases to showcase the value of IoT.


But surely these are early, exciting days. When the excitement settles and players like IBM, Cisco etc. bring out stuff that really works, there will be adoption. How much? That remains to be seen.



Everything As A Service


Platform-As-A-Service (PaaS), Communications-As-A-Service (CaaS), Infrastructure-As-A-Service (IaaS) and many other X-aaS systems and platforms have emerged. In fact, XaaS is now being used to denote the 'Everything-As-A-Service' mentality.

The 'Everything-As-A-Service' approach is one that makes a great deal of sense. If an organization decides it needs a new platform or a new application, it doesn't have to undergo a long, drawn-out technology selection project and a subsequent implementation project to get the right technology into its data center.


Using XaaS, a company can identify the technology, platform or application they need or want and then simply sign a contract or sign up for service. There's no need to look at data center sizing, efficiency, utilization rates or power requirements to ensure a new platform can be implemented. There's no need to spend months trying to integrate a new platform or application with your existing infrastructure (although integrating XaaS can be just as difficult as non-XaaS is).


Regardless of where you might stand on the XaaS model, there's no denying that the approach provides organizations with the ability to deliver services and systems quickly and efficiently without a large capital outlay to purchase, install and integrate new systems and hardware.


Additionally, the XaaS model allows organizations to build agility into everything they do. From start to finish, everything the IT group does can be agile focused and every project undertaken within the IT group and within the data center can have agility baked into the core of the project.


Another valuable reason to adopt the XaaS approach is that it lets the IT group build itself into an "as-a-service" offering. IT as a service is the next generation of IT service to organisations. It is the model for delivering the right services to the right people at the right time in the right way.


IT as a service is agility personified. With the 'as-a-service' approach, the IT group can quickly deliver any application, system, service or platform to the organization in a way that should allow lower costs, better management and more efficient operations when compared to the legacy approach that was focused on data center centric systems and platforms.


I've written a bit about the agile data center over the last few months. To build the agile data center, the IT group needs to become agile itself. To do that, the IT group can adopt the XaaS mentality and models to build and manage agility throughout the organization and within the data center.


This blog was first published at Author's Blog and is reposted here with prior permission. For more blogs from the author, Click Here




Some time ago, I was asked by Atheneum Partners to write a piece for their global newsletter.
The question posed to me was: what are the top cloud trends that will shape 2015? 

My take on it is as follows: 

The usage of cloud-based services continues to penetrate deeper into the enterprise than ever before. The fear factors of security, data control, privacy and contractual exit strategies continue to be tempered by the virtues of cost savings, availability, speed to market and innovation.

If you are evaluating technology upgrades, replacements or acquisitions, 2015 is the year that cements cloud on the list of considerations.

I have detailed below (in no order of importance) what I think will be the main cloud-focused trends in 2015, but I would love to hear what else you would add to the list.

Thanks to the price and feature wars between the biggest providers, including AWS, Microsoft and Google, the market is now more accessible than ever as organisations look beyond raw infrastructure for value.

Hybrid Clouds
Gartner broadly defines hybrid clouds as the combination of two or more cloud services coming together to create a unified cloud experience. It can be a mix of private and public cloud services, but can also include combinations that are all public or all private.

In 2015 a blend of on-premise and cloud services is pretty normal, but enterprises should adopt cloud services in a tactical way that ensures they're getting the right match and a secure model to suit the needs of their organisations. Hybrid cloud is the much-discussed direction that many organisations will ultimately follow.

Hybrid cloud management tools will improve and allow IT organisations to seamlessly administer and operate them securely.

Cloud Operating Models
As cloud services converge with social, mobile and information in what Gartner calls the "Nexus of Forces", organisations will need to start incorporating cloud operating behaviours into a platform for digital business.

Maturing and Well-defined Cloud Market
The cloud marketplace has matured significantly and moved away from the free-for-all approach of the past couple of years. Global-scale cloud providers such as AWS and Microsoft's Azure will continue to operate at the high end, but there will be lots of smaller, more regional, industry-focused custom providers to fill in the gaps around them.

Cloud Brokerages
There will be a rise of intermediation services that seek to help organisations manage and integrate their cloud services. Organisations new to the cloud, and those delving into the hybrid approach, will welcome such third-party providers and the niche skills they bring, but will need to decide how much control they cede.

Enterprise Workloads Moving in to the Cloud
Amazon's AWS has long been a go-to choice for those offering online services, but 2015 will see greater enterprise adoption of not just AWS but also Microsoft's Azure and Google's Compute Engine, amongst others.

Cloud is the new style of elastically scalable, self-service computing and many enterprises will look to embrace all that it can offer.

Containers Will Gain Momentum
Containers have helped solve many of the problems that the cloud poses for IT operations. Developers love containers, but IT operations now need to be able to containerise different parts of an application, locate them in different types of cloud infrastructure, and manage them as discrete units whilst keeping the parts acting as a whole.

Compliance and Regulations
As cloud platforms continue to mature, cloud is spurring interest from even those industries that have previously been hesitant. Think of those most beset with regulation, compliance and privacy: public sector, life sciences, financial and health care. Lots of cloud providers will take the necessary steps to receive appropriate industry certifications, creating more platforms designed to align to Sarbanes-Oxley and others.

Internet of Things
Interest in the Internet of Things will build throughout 2015. Positioning clouds and applications for it right now is difficult, but if your organisation is moving into this space you need to be prepared for how to capture and store the potentially large amounts of resulting data. Everything from orchestration to database management tools will need to evolve to better support this area.

Disaster Recovery
Traditionally this has been a problem area for IT but DRaaS enables you to address many previous problems such as testing, the high cost of installing a backup system and accurately mimicking potential issues. I think this will be a growth area in 2015.

With CIOs under constant pressure to deliver innovation and business value whilst continuing to provide BAU services, they are always looking for new ways to achieve their goals. Cloud services have often provoked fear in many enterprises due to security, data and privacy issues, but with the market rapidly maturing, costs falling, and security and services improving, could this be the year that cloud thrives?

Christian wrote this article for the Atheneum Partners Newsletter and it is reposted here with prior permission. To know more, visit the author's website:


Smart Buildings - First Step to a Smart City


A Smart City needs smart buildings, smart service providers, smart electricity meters, smart water meters, smart traffic and road safety management, smart waste management, smart security and so on.

We are caught between the existing cities and the new, upcoming ones. It would be easier to incorporate everything "smart" into a new city, but what do we do about the existing ones? The question is: how do we connect and bring together all the diverse and proprietary systems deployed in a building - be it heating, ventilation and air conditioning (HVAC), lighting, power, plumbing or water? Well... the solution is simple - a Building Automation System (BAS).


If a BAS is not deployed in a commercial and/or industrial building, then there is a need to deploy one in the first instance. Now, in the event the building owners and/or facilities operators have been "smart" and have an existing BAS deployed, then there is a need to protect the investment and keep the capital expenditure (CAPEX) at a bare minimum.

We should be aware that a standard BAS has its own limitations: it can normally manage and maintain individual systems but may not be able to interlink them, and hence cannot leverage the BAS infrastructure to report a status change in any one of the systems. For example, a standard BAS may not be able to modify the operations of a lighting system or an HVAC unit based on the number of people in an office or a commercial building, and thus optimize energy usage and consumption.
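The occupancy example can be made concrete with a small sketch of the kind of cross-system rule a siloed BAS cannot express: one sensor reading (people count) driving both lighting and HVAC together. The thresholds, setpoints and function name below are purely illustrative assumptions.

```python
def adjust_for_occupancy(occupants: int) -> dict:
    """Illustrative interlinking rule: a people-count reading drives both the
    lighting level and the HVAC setpoint at once, the kind of link a standard
    BAS managing each system in isolation cannot make."""
    if occupants == 0:
        # Empty space: lights off, relax cooling to save energy.
        return {"lighting_pct": 0, "hvac_setpoint_c": 28.0}
    if occupants < 10:
        # Partial occupancy: dim lighting, moderate cooling.
        return {"lighting_pct": 50, "hvac_setpoint_c": 25.0}
    # Full occupancy: full lighting, comfort cooling.
    return {"lighting_pct": 100, "hvac_setpoint_c": 23.0}

print(adjust_for_occupancy(0))   # {'lighting_pct': 0, 'hvac_setpoint_c': 28.0}
print(adjust_for_occupancy(40))  # {'lighting_pct': 100, 'hvac_setpoint_c': 23.0}
```

The savings come precisely from this coordination: neither system alone needs to know about the other, only about the shared occupancy signal.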

To be truly "smart", we need a modern BAS with basic functionalities including:

·       Device-to-device connectivity and data exchange

·       Remote manageability

·       High reliability

·       End-to-end hardware and software security


The solution, perhaps, is to look at the Internet of Things (IoT) and incorporate a BAS Gateway that supports multiple communication protocols, providing connectivity between the device networks, the backend infrastructure network and the existing BAS. Furthermore, the BAS Gateway also needs to have a cost-effective monitoring framework.


Today, BAS Gateways are available that can connect various sensors - such as a smart parking sensor, an office air quality sensor, an environment sensor (checking CO2 level, temperature, humidity, etc.) and an energy management sensor - over a wireless (WiFi) network to the existing BAS. Once deployed, these sensors help measure and realise energy and cost savings over the lifetime of the buildings.
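The core job of such a multi-protocol gateway can be sketched in a few lines: accept readings arriving in different protocol-specific shapes and normalize them into one common record for the backend. This is a minimal sketch; the protocol names, payload shapes and field names are assumptions for illustration, not any vendor's actual gateway API.

```python
import json

def normalize_reading(protocol: str, payload) -> dict:
    """Sketch of a BAS Gateway's translation layer: map protocol-specific
    payloads into one common record the backend BAS can consume."""
    if protocol == "modbus":
        # Assumed shape: a (sensor_id, raw_value) pair read from a register.
        sensor_id, value = payload
        return {"sensor": sensor_id, "value": value, "source": "modbus"}
    if protocol == "mqtt-json":
        # Assumed shape: a JSON string published by a WiFi sensor.
        msg = json.loads(payload)
        return {"sensor": msg["id"], "value": msg["reading"], "source": "mqtt-json"}
    raise ValueError(f"unsupported protocol: {protocol}")

# A wired CO2 sensor and a WiFi temperature sensor end up in the same format.
print(normalize_reading("modbus", ("co2-lobby", 412)))
print(normalize_reading("mqtt-json", '{"id": "temp-3f", "reading": 22.5}'))
```

Keeping the backend record format fixed is what protects the existing BAS investment: new sensor types only require a new branch in the gateway, not changes to the BAS itself.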

The benefit of this in India is that we do not have too much automation in our buildings and building processes, so we can actually jumpstart a couple of steps in our journey and our dream to create "Smart Cities." It is possible to "convert" existing buildings into "smart buildings" and to "build" new "smart buildings" using the Internet of Things (IoT).

Don't forget: smart buildings are the first step towards a smart city.

This blog was first posted on LinkedIn by the author.



As enterprises begin their tryst with digital, the pressure to rethink their IT infrastructure is visible. That's because the traditional infrastructure isn't sufficient to keep up with the business's accelerating demands. A digital transformation exercise, including enabling and delivering SMAC+ technologies, requires an agile, scalable, manageable and secure backend.

For IT, this means a change in the traditional approach towards IT infrastructure - separate components/solutions for compute, storage and the networks. The data centre needs to move away from a siloed approach and embrace emerging new models that are not only capable of giving cost and agility advantages but also ensure an enabling underlying platform upon which the digital technologies can optimally work. The practical alternative to it is Converged Infrastructure.

As the gap between business demands and IT starts widening, enterprises usually look at converged IT infrastructure/data centre solutions. There are, however, a few exceptions - mostly greenfield companies - in favour of an integrated approach (right from the start) to be future-ready.

The white paper Converged IT Infrastructure: A Must, Yet Not the Top Priority by Core Quadrant is based on a survey of 116 Indian enterprises. The survey found that a majority of CIOs are still mid-decision on the move to converged IT infrastructure, even though they realise that the traditional approach won't suffice to take on the digital onslaught with the speed, agility and flexibility it demands. The survey also highlights that organisations with a high digital score are more likely to go for converged infrastructure.



I was at a conference of small and mid-sized cloud service providers who were discussing the current state of the market and its evolution, with everyone talking digital. They were hoping to collectively brainstorm and learn from each other's experience. They discussed the evaluation criteria they were subjected to, the problem statements they had to answer, and the two biggest stumbling blocks that would not go away even with the maturity of cloud solutions and a growing customer base: ROI and security.


Some large enterprises have adopted a cloud first approach to their new initiatives while they seriously evaluate movement to the cloud whenever faced with any upgrade or refresh decision. These early adopters and fast followers now are more or less convinced that it does not make sense to continue investing in conventional hardware solutions. Data centers and servers are best left to the experts to manage while application management was outsourced a decade back. DevOps is the way to go and Cloud is where everything should reside.


Of course, there are industries which have seen exceptions for some types of solutions that are still not amenable to being on the cloud. Even the providers acknowledge this and keep away from pitching for such use cases. Big monolithic solutions are facing the agility challenge, and the paradigm has shifted to accommodate multiple fit-for-purpose apps on the cloud that are making some parts of the big solutions redundant, or enhancing productivity by reducing the effort to complete a workflow or task in the conventional solutions.


Consumer and personal apps reside on the same devices that are used at work; this transgression, managed or otherwise, is here to stay. CIOs and CISOs have learnt that pushbacks are no longer accepted, and they have to find a way to make peace and find solutions that allow coexistence. MDM has evolved to provide some level of containerization to separate the official from the personal, and the ability to brick a device should it be lost or fail to be returned on exit. So where is the unfulfilled promise of security and ROI, or is it just a favorite flogging horse?


How secure is your cloud solution? Have you had any security certification done for your software? When was the last penetration test conducted? What uptime do you offer on your cloud? Clouds are expected to save money; what is the ROI of your solution? The service providers' reality was that they had to field these questions every day, with every customer, with every opportunity, with everyone they met. It was as if repeating the message would strengthen its value and make it work for the customer and stakeholders.


After all the due diligence and certifications, customers then go on to deploy the solution with limited security governance and vulnerable practices that expose the data. Eventually, if and when data leakage does occur, the cloud and/or the solution is deemed immature and not up to the mark. Attempting to create idiot-proof solutions, with all the checks and balances to protect against human stupidity, is the final and ultimate step in ensuring that the solution is secure; this has remained the goal of every enterprise and the challenge for every provider.


Return on investment is a different ballgame; value is a function of the frame of reference of the perceiver and has little to do with reality. For one customer, a dollar a month per user may be value; for another, $10 is not expensive. Can service providers do justice to such a wide spectrum of expectations? I am not sure that kind of elasticity exists; volume-driven discounts or market-entry strategies may offer low initial pricing, which is rarely sustainable in the long term unless the end game is market valuation and not profitability.


At the end of the discussions, the collective wisdom was that alleviating the fear factor will take time, with evolution being inconsistent and everyone wanting to reassure themselves about the risk factors. It does not matter how many have taken the leap of faith or how long the solution has been around. Even today there are buyers apprehensive of every decision, lest it not work in their unique environment or they fail to leverage the value. I think the discussion will keep popping up, and we will have to reassure a zillion times over.


This blog is reposted with permission from Arun Gupta.




Deloitte has recently released its sixth annual report, "Tech Trends 2015 - The Fusion of Business and IT." It outlines the top technology forces with the potential to reshape business models, reimagine customer engagement, and change how work gets done.

Across the private and public sectors, business strategy is being transformed by the rapidly changing technology landscape. CIOs and technology professionals have an opportunity to vet, prioritize and invest in fast-paced technology developments. Deloitte's "Tech Trends 2015" report examines how some of the biggest macro technology forces -- digital, analytics, cloud, the renaissance of core systems, cyber security, and the changing role of IT within the enterprise -- are enabling historic advancements in business, government, and society. It also examines how they are fueling breakthroughs in materials science, medical science, artificial intelligence, and other exponentially changing domains.

Bill Briggs, CTO of Deloitte Consulting and the author of the report, says that it looks at the remarkable rate of IT change and provides an insider's view of what is happening today and what is anticipated in the next 18-24 months -- across industries, geographies, and company sizes. It encourages IT executives to be the catalyst of change for emerging technologies -- helping the business understand the 'what,' the 'so what,' and the 'now what'.

This year's theme of the report, "The Fusion of Business and IT," is broadly inspired by a fundamental transformation in the way business C-suite leaders and CIOs collaborate to harness disruptive change, chart business strategy, and pursue transformative opportunities. Whether it is a manufacturer looking at printing replacement parts on demand, or finding ways in the health care industry to harness artificial intelligence to improve cancer diagnosis and treatments, the confluence of business, technology, and science is being seen and felt across all markets and industries.

The report offers "Lessons from the Front Lines," highlighting examples of organizations putting the trends to work. It also features a "My Take" section for each trend, in which business executives, academics, and industry luminaries share their perspectives. In recognition of the increasing importance of cyber security in today's global world, a "Cyber Implications" section has been added within each chapter, exploring potential security and privacy considerations for each trend.

Sample trends from "The Fusion of Business and IT" include:

·       Ambient Computing - Ambient computing is the backdrop of sensors, devices, intelligence, and agents that can put the Internet of Things to work. Possibilities abound from the tremendous growth of embedded sensors and connected devices -- in the home, the enterprise, and the world at large. Translating these possibilities into business impact requires focus -- purposefully bringing smarter "things" together with analytics, security, data, and integration platforms to make the disparate parts work with each other.

·       CIO as Chief Integration Officer - As technology transforms existing business models and gives rise to new ones, the role of CIO is evolving rapidly, with integration at the core of its mission. Increasingly, CIOs need to harness emerging, disruptive technologies for the business, while balancing future needs with today's operational realities. They should view their responsibilities through an enterprise-wide lens, helping ensure critical domains like digital, analytics, and cloud aren't spurring redundant, conflicting, or compromised investments within departmental or functional silos. In this shifting landscape of opportunities and challenges, CIOs can be not only the connective tissue, but the driving force for intersecting, IT-heavy initiatives.

·       Dimensional Marketing - Marketing has evolved significantly in the last half-decade. The evolution of digitally-connected customers lies at the core, reflecting the dramatic change in the dynamic between relationships and transactions. A new vision for marketing is being formed as CMOs and CIOs invest in technology for marketing automation, next-generation omnichannel approaches, content development, customer analytics, and commerce initiatives. This modern era for marketing is likely to bring new challenges in the dimensions of customer engagement, connectivity, data, and insight.

·       IT Worker of the Future - Scarcity of technical talent is a significant concern across many industries, with some organizations facing talent gaps along multiple fronts. There are also unprecedented needs for new and different skill sets, including creative design, user experience engineering, and other disciplines grounded in the arts. To tackle these challenges, companies will have to nurture a new kind of employee--the IT worker of the future--who possesses habits, incentives, and skills that differ from those in play today. They will also need to develop new techniques for organizing, delivering, and evolving the IT mission.

Other trends include the API economy, software-defined everything, core renaissance, and amplified intelligence.

In collaboration with Singularity University, the report includes a section dedicated to six exponential technologies -- innovative disciplines evolving faster than the pace of Moore's Law, whose impact may be profound: artificial intelligence, robotics, additive manufacturing, quantum computing, industrial biology, and cybersecurity.

The full report can be accessed online.

Source: Deloitte Press Release

Elephants Can Dance: Microsoft 2.0


On 21 January 2015, Microsoft showed the world that elephants can dance. For a company that failed to see and shape the mobile revolution, the event told the world that Microsoft will shape the Internet of Things and lead the unification of user experience across platforms and devices.


For starters, Microsoft got the pricing of Windows 10 right; giving it free to Windows 7 and 8.1 users will at least tempt retail users to try the OS and, if MS' promises hold, migrate permanently. Corporate users are more difficult to budge, but the pricing and the one-year timeline will compel CIOs to look long and hard at the migration costs and benefits.


Windows 10 has also hit a number of sweet spots: Spartan is a much-needed browser upgrade; Cortana across mobiles, tablets and PCs will change the way users interact with computers; universal applications like MS Office will change the definition of mobile productivity; the Continuum mode will provide the much-craved seamless transition between devices; and, most importantly, Windows 10 has the Start menu!


For CIOs, the baked-in containerization of Windows 10, control over content encryption, and selective access control on applications will become compelling features; add Office to the mix and Lumia phones suddenly look like very sensible business companions. Finally, the lockdown capability that allows corporate policy to determine which applications are secure and have access to machine resources is priceless.


HoloLens and Surface Hub are two innovative applications of Windows 10. HoloLens is MS' adaptation of augmented reality and changes how users interact with physical objects by creating holographic experiences in the real world. The Surface Hub, on the other hand, fulfills a here-and-now requirement by making displays smart. It will support digital whiteboarding, remote conferencing, sharing and editing content on the screen from any device, and a display for large-screen apps.


For anyone tracking these developments, it is clear that MS is innovating all over again, and this time around users will benefit along with the industry at large.



This blog is reposted from LinkedIn with prior permission from Ramnath Iyer. The views expressed in this blog are entirely his own and do not represent the views of CRISIL, its management or its employees.




Asianet Satellite Communications primarily offers digital cable TV and broadband in Kerala, and unlike the majority of MSOs in the country it does major business in both the B2B and B2C segments. After digitization, and being able to address customers directly through their TV sets, I feel mobility is the next major leap for this sector, especially in terms of customer experience. Our decision to implement mobility within Asianet was also directed towards enhancing the experience offered to customers.

With the application, customers can lodge complaints or suggestions from their mobile phones. It also supports complaint monitoring and reporting, wherein a customer complaint is delivered to the nearest engineer's mobile and the engineer's number is transmitted back to the customer. With this coordination the service is completed without issues, and the customer signs off with his/her rating of the service quality.
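The complaint-routing flow described above can be sketched in a few lines (a hypothetical illustration only; the class and function names, and the one-dimensional distance model, are ours, not Asianet's):

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Engineer:
    name: str
    phone: str
    location: float  # simplified one-dimensional position

@dataclass
class Complaint:
    customer_phone: str
    location: float
    rating: Optional[int] = None  # filled in when the customer signs off

def route_complaint(complaint: Complaint, engineers: List[Engineer]) -> Engineer:
    # Assign the complaint to the engineer nearest to the customer;
    # the real system would then send the engineer's number back to the customer.
    return min(engineers, key=lambda e: abs(e.location - complaint.location))

engineers = [Engineer("Anil", "111", 2.0), Engineer("Biju", "222", 9.0)]
complaint = Complaint("999", 3.0)
assigned = route_complaint(complaint, engineers)  # nearest engineer: "Anil"
complaint.rating = 5  # customer rates the completed service
```

The key design point is the closed loop: assignment, contact details in both directions, and a customer rating recorded against the same complaint record.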

Further, the mobile application allows the Consumer Application Forms to be collected digitally, receipt accepted over mobile and the connection given instantly on the spot.

We have also developed a module for cash collection, but haven't yet launched it due to some logistical and security issues, since cash is involved. This will be completed before March 2015. There are various other modules that we propose to launch through this platform.

Measurable Business Impact of Enterprise Mobility Application

Enterprise mobile technology is positively impacting our business in a big way, with customer experience at an all-time high. There are several examples of impact areas. For instance, the customer does not have to wait for a call centre agent to lodge a complaint, as it can be done instantly from the mobile at his/her convenience.

Also, being able to monitor complaints over mobile has helped reduce the overall turnaround time (TAT). Further, the productivity of the engineer is easily measured based on tracking of complaint resolution.

The mobile application has also helped the company acquire new customers. The time as well as the cost of acquiring a new customer has reduced drastically, as new connections can now be given instantly on the spot.

All these have contributed to delivering an overwhelming customer experience.

Asian Paints Achieves Better Business Visibility with SAP HANA


With our ever-expanding network of dealers, the amount of data to be dealt with also increased exponentially. Analyzing sales trends and other key performance indicators for such humongous data quantities started becoming a challenge with our earlier IT architecture, greatly limiting data access, visibility and usability.

Is there any one technology/business/process decision that you took in 2014 that made a "Measurable Business Impact" (MBI)?

One of the key decisions we made was to invest in In-Memory Database technology (SAP's HANA). We decided to take our ERP to the In-Memory platform.

The decision has really helped us to create an IT architecture that allows us to think differently while offering solutions to the business. We are trying to build many innovative apps on top of this platform.

From a disaster recovery perspective this required an entirely new skill, as the HANA platform comes with its own replication technology. It now allows us to synchronize replication with zero data loss, which was a driver for our IT-resiliency business imperative.

If you look at our business, we sell directly to thousands of dealers, and during month-ends a lot of selling happens. Traditionally, while we did billing we could not carry out analytics. The need of the hour was to correlate what we sold in the market with the schemes that were available in that particular market. If we could analyze the data, we could cross-sell/up-sell to the dealers.

What has been the quantifiable measurable impact of the decision?

While the core ERP has already moved to this platform, we are now thinking of next-generation applications that can be woven in for supply chain execution and planning from a common user-experience perspective. In short, we are looking to merge analytics and execution on a common platform.

With this technology we now have the visibility. This allows the business to help dealers achieve their targets, and from the supply chain side it allows them to track sales across regions.




One of the key initiatives this year was to help our patients get access to lab reports online. The portal has helped us to drastically improve patient experience. The quality of interaction between the patients and doctors has undergone substantial improvement.

Is there any one technology/business/process decision that you took in 2014 that made a "Measurable Business Impact" (MBI)?

One of the key initiatives this year was to help our patients get access to lab reports online.

We had implemented STAR LIMS to support the KDAH Clinical Lab. However, it only supported pre-barcoded Vacutainers, which resulted in errors; we had limited analyzer-interface availability; and preliminary result reporting was missing.

In addition, the response time/performance was not satisfactory, multi-layer authentication was not available, and the Clinical Lab SLA was not up to the mark.

To address all these issues, we rolled out iSoft to implement the new system. The report portal is integrated with the Lab Information System (LIS) and the website.

What has been the quantifiable measurable impact of the decision?

Some of the key benefits of the solution have been:

·       On-line result entry through interfaces from the analyzers; all analyzers in our lab are now interfaced in bidirectional mode

·       Manual result entry reduced to under 1 percent, done only in the rare case when an interface is down, thereby reducing manual error

·       Effective manpower utilization of clinical lab staff

·       Validation of results by age and sex for abnormal ranges

·       Option to hold results for review before releasing

·       Customers can download PDF reports from anywhere

SRL Diagnostics Uses Cloud to Enhance the Customer Experience


At SRL Diagnostics, we were looking to improve our existing customer service experience. We use our proprietary Lab Information System, CLIMS, for the lab process from sample registration to delivery of the report to the patient.


Is there any one technology/business/process decision that you took in 2014 that made a "Measurable Business Impact" (MBI)?


In our bid to provide a better and well-rounded customer experience, we needed a scalable future proof customer experience solution which could help us keep track of the customer service history and make the information accessible to our call centre staff.


For this we decided to roll out a Customer Relationship Management (CRM) application for our customer care department. We partnered with Wipro to provide a full spectrum of services around Oracle Service Cloud, including system integration, change management, and application support. We developed a custom-built toolbar for call management, integrated with SRL's existing customer care infrastructure.


What has been the quantifiable measurable impact of the decision?



There has been a drastic improvement in call-handling times, delivering a consistent service across channels like voice, email, web, chat and social media. The solution has reduced costs, provided transparency, strengthened security, and given us cloud freedom for mission-critical customer experience delivery.


It has helped us streamline our call centre operations and provided an incident-tracking mechanism.



Godrej Moves E-mail to the Cloud


Our vision is to grow 10 times by 2020 (the projection is part of the group's 2020 vision, unveiled in 2011 during the Godrej leadership forum). To achieve this growth, IT will play a crucial role. We want our IT to be agile and scalable in delivering business services.


Is there any one technology/business/process decision that you took in 2014 that made a "Measurable Business Impact" (MBI)?


We have been one of the early adopters of cloud, and we will continue to look at the model to provide us with faster and more scalable IT services.


One such decision was to move our email system to the cloud. The servers for our email system were due for a refresh. We decided to adopt Microsoft's Office 365 to integrate communications between our group companies across India as well as our international entities. We were aware that the cost of an on-premise solution was almost the same as moving to the cloud.


However, the cloud solution would help us enable growth for all our group companies, which have been growing rapidly both nationally and globally.


What has been the quantifiable measurable impact of the decision?



We are now able to scale our email system with ease, and support for our end users has improved. There has been a reduction in data centre hosting and manageability costs. With the Office 365 suite we also got Lync Online, SharePoint Online and Yammer, which we have also put to use.


Our email is backed by service-level agreements, performance guarantees, disaster recovery and assured business continuity.

Serco Delivers Business Insights on the Move


Providing unique insight into the business is an expectation that senior management has from IT.

Is there any one technology/business/process decision that you took in 2014 that made a "Measurable Business Impact" (MBI)?

We have multi-location delivery centres spread across the globe. Our end customers and clients wanted to know how their business processes were delivered in various parts of the world. This motivated us to develop a mobility solution.

Traditionally, the performance was displayed on wall boards that were later moved to desktops.

We took this a step further and rolled out a mobility platform called Business Pulse. We worked very closely with the business to create this solution. A PoC was first conducted for our internal customers. Part of the solution, for example the user interface (UI), was developed in-house.

The tool was meticulously planned and created to mobilize our business MI, or management information, which comes through our delivery centres. We built an app that allows end customers to look into the core processes that run and support their activities. This is a very innovative tool, and probably a first in the BPO industry.

The next step will be to develop MDM capabilities on this solution. It has created huge momentum, and we have ensured that the solution goes across different sets of clients.



What has been the quantifiable measurable impact of the decision?


Business Pulse provides real-time MI to our end customers on their mobile devices in a secure manner. This has resulted in increased trust, transparency and accountability with our customers.

Apart from having access to real-time data, customers can also pull out historical reports as and when needed.




Chemicals manufacturing company Atul Ltd.'s operations are built on two core enterprise business solutions - Oracle eBusiness Suite R 12.1.3 ERP and PeopleSoft 9.1 HRMS. Both systems have been operational for the last five years, with over 1,100 customised reports developed as per business users' requirements.


As a result, the business data is split into two clearly distinct segments - 'live transactional data' and 'business reporting data' - in the Data Warehouse.


IT conceptualised and implemented a zero-cost, open-source MySQL Pentaho Kettle Spoon Data Warehouse. The ingenious decision to opt for an open-source solution (instead of the conventional Oracle Database-based Data Warehouse) has drastically reduced the TCO of the IT infrastructure as well as the application database cost, while giving business users a positive and agile computing experience. Besides, the solution has laid the foundation for reporting and Business Intelligence operations through its database with lower effort.


Measurable Business Impact of Open Source Data Warehouse


TCO calculations for the Oracle Data Warehouse vs. the Open Source Data Warehouse over a period of three years:


Oracle BI Suite Enterprise Edition (OBIEE) Data Warehouse


a)      One Time Cost

License Fee: INR 3.00 Crores

Total One Time Cost: INR 3.00 Crores


b)      Annual Recurring Cost

Annual Technical Support Fee (ATS) to Oracle: INR 0.75 Crores (@ 22% + 3% IAR, Inflation Adjustment Rate)

Total Annual Recurring Cost: INR 0.75 Crores

Recurring Cost for 3 Years: INR 2.25 Crores


TCO for 3 Years (a+b): INR 5.25 Crores


Open Source based Data Warehouse Solution


a)      One Time Cost

Zero cost for one time ownership

Total One Time Cost: Zero


b)      Annual Recurring Cost

One Year Recurring Cost: Zero

Recurring Cost for 3 Years: Zero


TCO for 3 Years (a+b): Zero


Hence, the clear financial saving accrued is INR 5.25 crores. Besides the monetary impact, users have also benefited from superior speed and efficiency from this unique and unconventional solution.
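The arithmetic behind these figures can be sketched in a few lines (a minimal illustration; the function name is ours, and the article's flat INR 0.75-crore annual ATS figure is taken as given rather than re-derived from the 22% + 3% IAR rates):

```python
def three_year_tco(one_time_cr: float, annual_recurring_cr: float, years: int = 3) -> float:
    """TCO in INR crores: one-time cost plus a flat recurring cost per year."""
    return one_time_cr + annual_recurring_cr * years

oracle_tco = three_year_tco(3.00, 0.75)       # 3.00 + 3 * 0.75 = 5.25 crores
open_source_tco = three_year_tco(0.0, 0.0)    # zero license fee, zero support fee
savings = oracle_tco - open_source_tco        # 5.25 crores saved over three years
```

Note that this simple model ignores implementation and operations effort, which both options incur; the comparison covers only license and support fees.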



With technology emerging as a key driver for business transformation, enterprises across the Asia Pacific region will increase their IT investment in new technologies next year, according to the Microsoft Asia Pacific CIO Survey.


The survey's infographic shows how CIOs are taking the lead to transform and successfully operate in a mobile-first, cloud-first world.


As per the findings, 62 percent of respondents said that they plan to increase their IT investment in new technologies next year.


The study finds that CIOs in Asia are grappling with new demands from customers, employees and business stakeholders. In terms of prioritization, enhancing customer experiences, transforming into a digital business, and becoming a more responsive organization are extremely important areas of their transformation.



Information is the core business for Bureau Veritas, a leader in the testing, inspection and certification business. Thus, for Laurent Serano, its Global CIO, the company's revenues are tightly linked to the capability of its Information Systems. He sees to it that the millions of units of information collected across operations in 147 countries are utilized, through technologies like mobility and big data/analytics, to offer greater value to customers. Serano shares innovations around Information Systems being applied to the business, and how he has worked to create an efficient, global organization through the business value of IT.


Q. CIOs are increasingly under pressure, with management expecting them to enable newer revenue streams and contribute to the business. How are you as a CIO enabling this for Bureau Veritas?


Bureau Veritas is into testing, whether it is shirts, toys, electronic gadgets or minerals. Much of what it does is collect information, process it and give it back to the customer. Therefore, information, or information processing, is our core business. Our revenue stream is tightly linked to the capability of our Information Systems. Again, a lot of the innovation to boost profit margins and deliver innovative offerings to customers is mostly around Information Systems.


Q. Any example of innovation around Information Systems being applied to the core business?


The mobility initiative in our inspection business, which accounts for almost half of the company's revenues, is one such example. All the inspectors in the field are equipped to collect and share information, like pictures of the assets being inspected, in real time and using any mobile device - from basic smartphones to sophisticated tablets. Thus, mobility is key to the operation and growth of the inspection business.


We also hire sub-contractors, and the biggest challenge here is giving them access to our systems while we have no control over their devices. For this we have developed a cloud-based mobile app that they can download on their smartphones to connect with our systems in a secure manner and be ready to work with us.


Overall, Bureau Veritas is a big user of mobility and well known for it worldwide, with almost 40,000 of our people using mobility solutions.


Q. What have been your learnings from the mobility experiences?


It can be quite complex, as in mobility you have to master the device, and devices keep changing all the time. Then there is the complexity of managing multiple platforms. So far we have been developing the same application thrice for the three platforms we use - iOS, Android and Windows. We are quite unhappy about it, as it involves a lot of re-work. While we have tested tools claiming to let you develop the core once and use it across all platforms, the results have been disappointing so far. In mobility, things are going to stay closely tied to the device to get the requisite value.


Then, understanding the difference between smartphones and tablets is important. While a tablet with a keyboard enables entering large volumes of information, a smartphone is more suited to check boxes and small information volumes. Which to select should depend on the intended usage. In our case, we currently have half of our mobility solutions on smartphones and the other half on tablets, mostly Windows tablets as they give access to all sets of information. We are so far quite happy with this investment. We see Windows tablets as the future for B2B in the enterprise.


Q. Where do big data and analytics stand in your scheme of things, considering information is the bread and butter of your business?


Considering we collect millions of units of information across our worldwide operations, a key innovation we have been driving is around putting up powerful data analytics to provide greater value to our customers.


Today, customers are not merely interested in plain electronic reports of tests or in checking real-time progress through customer portals. Rather, they now want test results delivered in a data-based fashion, with analysis of aspects like where they stand vis-a-vis competitors as well as industry standards. Imagine the tremendous value for a retailer purchasing from a particular supplier in knowing whether that supplier is better or worse than another supplier used by the competition. As a matter of fact, we are not delivering plain reports to our retail customers in the U.S. Instead, they log in to their management consoles and get great analytical value out of them on various aspects.

Derek Manky, Global Security Strategist at Fortinet, formulates security strategy drawing on more than a decade of advanced threat research. His ultimate goal is to make a positive impact in the global war on cybercrime. I spoke to him about the security threat landscape, the security implications of the Internet of Things, and the rise of the underworld economy. Glad to present some of the excerpts here:


Q] How do you see the current information security threat landscape? What are some of the exploits that enterprises should be worried about in the coming days?

What I see is a shift in the nature of attacks towards the path of least resistance. It is hard for hackers to go after a fully updated Windows 8 system or Adobe Reader, so they are looking at alternatives where they can infiltrate a device with ease.

Before coming to India, I pulled out a threat intelligence report for the region, and what I found was very interesting: among the top ten enterprise attacks, some of the most targeted were on D-Link IP security cameras and network-attached storage.

Other platforms and connected devices, like printers, are also a cause for worry. Printers are an often-neglected component in network security defenses. With their increasing complexity, internet connections, random access memory (RAM), integrated disk drives, and multi-functionality, network printers are a potential security vulnerability. They can create a channel that is not inspected by the firewall; malware can be planted on a printer and used to access computers.

Apart from these, plenty of new hardware and software products have entered corporate networks. Many of them are not tested and lack product security teams, which makes them vulnerable to attacks. The challenge for enterprises in the next year will be the lack of security update mechanisms from such software and hardware providers. Intrusion prevention will be key to blocking such attacks.


Q] The Internet of Things (IoT) promises great things with its profusion of connected devices, but it also brings with it significant risks. What according to you will be the security implications of IoT?

With IoT there will be more attacks, as it is low-hanging fruit for attackers. Heartbleed and ShellShock are just the tip of the iceberg; there are many such exploits out there that have not yet been discovered.

IoT is a potential management problem. Think about traditional security: enterprises have secured their endpoints with anti-virus and other solutions. With IoT there will be many non-user endpoints, content in the form of objects, and content generated by objects without direct user involvement. There are no anti-virus solutions available for your refrigerator, printers or other such devices. This is a big challenge.

We are tackling this with what we call platform-agnostic inspection. With the rise of IoT, network-based security will become more important than endpoint security.

Q] Moving forward, what is Fortinet's strategy to mitigate emerging security threats?

We are very strong in our Next Generation Firewall offering. We were the first network security vendor to deliver a firewall that exceeds 1 terabit per second throughput performance. So we have the processing power, and the capability that enables our Global Security Team to apply all the intelligence gathered in real-time.

I want to highlight that with the rise of APTs, everyone needs to get deeper into content inspection, as hackers are using several evasion techniques. Our approach and strategy will be to combine threat research with our solutions to mitigate security attacks.

In short, we will primarily focus on two things. One is customer protection, where we use our intelligence and research to update all the devices. The other is taking this intelligence and working with the industry and with law enforcement agencies such as Interpol.


Q] As a cybersecurity expert and someone who is involved with several threat response and intelligence initiatives, can you share some insights into the underground economy/cybercrime?

I have done a lot of research in this area. Cybercrime is big business, and it is growing in scope and impact. It is an over US$500 billion industry in terms of the damages caused. In comparison, IT spend on security is around US$30 billion. The resources and the number of people on the bad side are huge. There are people who offer consulting on money laundering, botnet rentals and other services.


In the underground economy we have something known as crime-as-a-service; then there are the producers of malware or exploits, and finally the organizers with deep pockets who run such programs to infiltrate systems.

We are seeing 600,000 hacking attempts per minute on our global network. There are many instances where hundreds of attackers are working on the same virus to plant it on a system. Each one has a different method and uses a different channel to carry out the attack. It is very difficult to inspect all the channels.


The number of enterprise-built mobile apps continues to rise. However, the time and money it takes to build them, and a lack of a mobile strategy, are holding companies back from making the most of mobile.


As per a study commissioned by enterprise Backend as a Service (BaaS) provider Kinvey, lack of strategy is one of the key reasons why mobile apps can take up to a year to develop. This in turn is preventing businesses from creating apps that disrupt or seriously change business processes.


The report found that, with so many CIOs frustrated by the time it takes to develop a single app, 63 percent stated that they will look at cloud solutions to address their needs. In addition, 54 percent will standardize the development process, while 42 percent will consult with external experts.


To read the full report, please visit:



The Cloud: Gateway to Enterprise Mobility


Can you remember what it was like to do your job ten years ago? For the most part, you were stuck at your desk, most likely using a large desktop computer with a very large CRT monitor. If you were lucky, you might have received a laptop that let you move around the office and travel for business. Of course, these "lightweight" laptops weren't really that lightweight, but they did let you work away from the office whenever you needed to. Lastly, those who were really lucky might have had a BlackBerry device to keep on top of their email.


The world of mobile ten years ago was one that was mobile, but wasn't. Sure, you could get away from your desk, but you weren't always able to do everything you needed to. The security issues that existed were fairly minor. Companies set up virtual private networks (VPNs) to allow access to company systems from outside the firewall. Access via BlackBerry devices was fairly secure and straightforward, and access other than via VPN or BlackBerry was generally unavailable outside the office. Little thought was given to other mobile devices or other forms of mobile access, and very rarely was there the capability for employees to bring their own devices into the corporate environment and network.


Today, mobility is much different in most organizations. Laptops are more than ubiquitous, and people regularly work from both inside and outside the organization's firewall. In addition to laptops, tablets and smartphones are almost as ubiquitous throughout organizations today, with many being personal devices brought into the organization by employees.

IT operations and security have their hands full with the various mobile devices and mobility requirements placed upon them by the wants and needs of the organization. Many organizations have mobility solutions and systems on their 'to do' list to ensure mobility is implemented in a way that allows the organization to meet their goals and objectives.

According to a recent report by IDG Research, 54 percent of organizations have a mobility strategy and have begun some form of implementation while 38 percent of companies are currently formulating a mobility strategy. Those are fairly good numbers that show most companies have identified mobility as a key issue for the future and are trying to address it head on. While these companies seem to have a strategy and implementation plans, both need to be flexible during and after implementation to incorporate new technologies and systems.

One of the main focus points of any organization's mobility strategy should be the cloud. The cloud is a game-changer when it comes to mobility strategy and implementation. It can help off-load already over-burdened data centers and IT systems, as well as help make the bring-your-own-device (BYOD) paradigm a reality with very little overhead within the organization.


Enterprise mobility is a necessity for organizations today, tomorrow and into the future. Using the cloud to help facilitate mobility is a no-brainer, as it allows mobility to exist from the very beginning of the system lifecycle. With proper planning, the cloud can bring the ultimate mobility to an organization while still providing strong data security and data access from anywhere in the world.


How well is your organization planning for mobility within your enterprise?

This blog is re-posted with permission from Eric D. Brown. To read Eric's blogs, you can visit:



Designing the Sandbox for Enterprise Innovation


In one of my posts (IT Challenges for the Modern Enterprise), I discussed the issues confronting an enterprise and how they translate into the need for building innovative applications. Here is an outline of the sandbox that could meet this need.


While it is easy to understand the need for innovative business applications, enabling their development in an enterprise is not. Let us explore four major factors that shape an enterprise innovation platform.


Speed: An enterprise innovation platform should enable applications to be developed quickly, integrated with data sources or systems of record in the company, and deployed for ready use by business. While speed of application development is important, it is also necessary to get the user experience and scalability right - two traditionally time-consuming elements.


Ideation: Innovative applications need ideas, inputs, and participation from many different people in the enterprise, even during development. While most developer tools are not accessible to business users and others beyond IT, a collaborative environment gives the company an opportunity to innovate faster.


Involvement: Apart from critiquing and suggesting improvements post facto, it is good to involve business users early on in application building. The innovation infrastructure should provide a way for "citizen developers" (business-user-developers) in the company to compose applications with only a thin software staff to support them.


Evolution: No one knows exactly, and it cannot be pre-determined, which applications are needed for the company to succeed through innovation. At times the specs are unclear to start with, or only become clear when the application (and its underlying intent or strategy) is used in the market. That means the innovation infrastructure must be designed for multi-round evolution of applications, with zero hindrance to change, and always open to chance.


The Sandbox should obviate 'Shadow IT'


Business users and software developers are not happy with the pace of innovation in their companies, and find it hard to understand the numerous processes and limitations that prohibit them from moving faster. In some cases, this results in 'Shadow IT'. While the intent may be good, the path may prove not to be.


The Internet is full of stories of things gone wrong with Shadow IT. The worst of all is when sincere efforts at innovation end up in wasted resources and frustrated talent.


This necessitates an IT-sanctioned platform that not only encourages enterprise innovation but also makes innovative application projects as effortless as possible.


How the Sandbox makes it easy




The speed and ease of app development is the key to the enterprise innovation infrastructure (the Innovation Sandbox). Replace traditional methods of long-drawn software projects (possibly through an RFP process) with Rapid Application Delivery (RAD).


Speed of development is important, but not at the cost of the application's user experience. This matters because every user, including the company's employees, is influenced by the great apps of the consumer world.


Therefore this sandbox environment must enable a wider section of the enterprise to participate in building applications, and not be limited to professional developers.




Any enterprise application has to integrate with important data stored in the company's Systems of Record (applications that hold key data) and perhaps with Systems of Differentiation (applications that automate processes).


Enterprise application integration is achieved rapidly when it is done through Application Programming Interfaces (APIs). The simpler architecture of REST APIs, along with learnings from the consumer web, is finding widespread adoption in the enterprise.
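As a rough illustration (mine, not from the original post), here is a minimal sketch of the kind of REST integration described above. The endpoint, the resource names and the `fetch_record` helper are all hypothetical; the injectable `opener` simply makes the sketch testable without a live System of Record.

```python
import json
import urllib.request

def fetch_record(base_url, resource, record_id, opener=urllib.request.urlopen):
    """GET one record from a (hypothetical) System-of-Record REST API.

    A REST integration needs little more than a URL convention and a
    JSON parser, which is why it can be wired up so much faster than
    traditional point-to-point integration middleware.
    """
    url = "{}/{}/{}".format(base_url, resource, record_id)
    with opener(url) as resp:
        return json.loads(resp.read().decode("utf-8"))

# A sandbox app would call it along these lines (endpoint is illustrative):
#   customer = fetch_record("https://erp.example.com/api", "customers", 42)
```

Passing a different `opener` also lets a sandbox app be unit-tested against canned responses before it ever touches production systems of record.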




Building an app for scalability is a challenge that can slow down any development team. A gradual decline in hardware pricing and the increasing power of commodity hardware mean developers no longer need to build software for hardware scarcity; they can assume a scalable platform underneath. The sandbox has to enable designing for the cloud, where applications are built to scale horizontally (through the addition of computing resources as needed at runtime, without the need for software changes).
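To make horizontal scaling concrete, here is a toy sketch (my illustration, not from the original post) of the stateless design it relies on: because each replica keeps no local session state, a load balancer can send any request to any replica, and capacity grows simply by adding replicas. The shared dict stands in for an external store such as a database or cache.

```python
# Shared external store (stand-in for Redis, a database, etc.).
# In a real deployment this lives outside every app replica.
session_store = {}

def handle_request(replica_id, user, action):
    """A stateless request handler: all session state lives in the
    shared store, so any replica can serve any user's next request,
    and adding replicas adds capacity without software changes."""
    state = session_store.setdefault(user, {"cart": []})
    if action == "add_item":
        state["cart"].append("item")
    return {"served_by": replica_id, "cart_size": len(state["cart"])}
```

Note that two different "replicas" serving the same user see the same state; that property is what lets the platform scale out at runtime.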


An IT-administered private cloud (under IT control, with flexible on-demand provisioning, and enterprise data-safe, hence private) needs to power the enterprise innovation infrastructure, to remove dependence on IT for provisioning computing resources and reduce wait times for apps to become available to business users.




Innovative applications cannot be conceived and built in isolation. The wider the participation, the higher the chances of innovation in the enterprise. Collaboration tools integrated into the development process improve visibility of the apps being built, API design, and coordination within the team as well as outside it.




While the sandbox should be a simple, fully-integrated environment, the task of managing such an environment is complex. This job of configuring, managing and monitoring the system needs powerful and integrated management tools.


Sandbox means freedom

Sandbox image.jpg

The Sandbox for enterprise innovation (a new piece of software running on an IT-sanctioned environment) should provide the essential freedom from the issues that confront today's enterprise app development teams. The picture above captures the essential ingredients of such an infrastructure. Clearly, to think outside the box is to think rapid development, to think cloud, and to think rapid integration through published APIs.


As Peter High, the author of "World Class IT: Why Businesses Succeed When IT Triumphs", says, "Infrastructure excellence separates proactive organizations from reactive ones." Creating an innovation sandbox for the enterprise is fast becoming an important responsibility of the CIO's office.

This blog first appeared on LinkedIn and is published with prior permission from the author.

How Modi Sarkar has Already Accelerated SMAC Adoption in India


The new Government of India has been credited for a number of things including changing the era of coalition politics, building relationships with neighbouring countries, fixing the policy paralysis and so on. However, increased SMAC adoption is not something that even the staunchest supporter will ascribe to the current government! Then why am I swimming against the tide and saying the opposite? Let me explain.


Using Social Media to get the pulse of India. Anyone not living under a rock would have received candidate profiles and other campaign material on their Facebook wall, in their mail inbox and even on WhatsApp. But what few people know is that the Modi Sarkar used sophisticated social media analytics to gauge the public reaction to its choice of candidates, its positions on issues and the impact of specific campaigns, and even to correct mistakes before they could snowball. Identifying influencers, targeting them with appropriate content and gauging their feedback was done in conjunction with SAP and InMobi.


While this is in the realm of speculation, I can go out on a limb and claim that social media was also used to run smear campaigns. How else can you explain the sudden spike in the number of jokes and gory details of scams surfacing just before the elections?


Using Holograms to address the masses. Traditional rallies could never hope to reach the masses, especially if one person had to address them all. To overcome this, the Modi Sarkar used holographic technology to address multiple rallies simultaneously across India. A hologram of Modi was projected onto mobile stages, helping the team organize a staggering 3,500-4,000 rallies in just 45 days. The impact was enormous, and the outcome was a Modi Sarkar wave in the Indian heartland.


SMS in Lingua Franca. Using SMS for campaigns is not new, and neither is communicating in local languages. But running an SMS campaign in the local language, and tailoring the communication based on demographics, is what differentiated the Modi Sarkar campaign.


There are differing estimates of the campaign spend, with the official cap at USD 75m for a single party along with its allies. The rumour mills, however, suggest a figure upwards of USD 2.5 billion as the campaign spend of each party. Even if we assume only a quarter of this spend was on technology, we have already seen a USD 500m technology spend by the current Government on SMAC, even before being voted to power!


While the Modi Sarkar is yet to roll out any large-scale e-governance initiatives or push any technology initiatives, the election campaign itself has shown that the SMAC toolkit, namely social media, analytics, mobile outreach programs and push marketing in local languages based on demographic analytics, offers invaluable marketing tools, not just for election campaigns but for any large-scale outreach program in India.


This blog is reposted from LinkedIn with prior permission from Ramnath Iyer. The views expressed in this blog are entirely his own and do not represent the views of CRISIL, its management or its employees.



Building a Data Culture


Many companies want to 'do' big data today. They're spending money on systems, software, consulting, training and other services to be able to capture, process, analyze and use data. Those are all things that need to be done to build up their data science capabilities and skills. Companies need the right platforms, the right systems, and the right people and skills to be able to properly analyze and use their data.

There's one area, though, that many organizations fail to address when building up their data analytics programs and skills: corporate culture. Specifically, the culture around listening, curiosity, investigation and the willingness to try and fail.

Corporate culture can play a huge role in the success or failure of data analytics programs. If your company's culture doesn't like hearing new data that may provide conflicting information, your big data initiatives may be set up for failure from the very beginning.

In my experience, the ability to listen to and act on new data is one of the most important aspects of corporate culture for success with data analytics and big data. If you don't have a corporate culture and a leadership team willing to listen to new information, trouble follows: if your CEO doesn't listen to data or arguments that go against her beliefs, you may be in for a very difficult time when your data analysis shows a reality different from the one she expects or wants.

While listening and accepting competing arguments and data is the top cultural issue that can make or break big data, the other cultural aspects are important as well.

For example, if the people working with your data aren't curious about it and willing to spend plenty of time investigating it, then you may be wasting money on giving those people the skills to become data scientists. You may be training them to act as your data scientists, but if they aren't interested in finding out more about your data and investigating new avenues of analysis, you may not get the most value from them or your big data initiatives.

Lastly, your corporate culture should be willing to accept failure. Now, I'm not saying you should embrace or excuse failure, but many times in the data analysis world you end up with analyses that don't match your expectations. Much of a data scientist's time is spent on small analysis projects looking for new ways to look at data. Many of these small projects end in failure, with nothing of measurable value to show for the time spent. Even though it may seem like wasted time, these types of projects are what make great data scientists, as they allow them to continuously improve their skills.

Successfully implementing big data initiatives is much more than just buying some software or systems. Successful big data initiatives require working on soft skills as well as organizational culture to ensure that the big data mindset is ingrained throughout the organization.


This blog is re-posted with permission from Eric D. Brown. To read Eric's blogs, you can visit:



BYOD - Freedom and Responsibility


Mobility is a multi-faceted asset, but some of the value it adds may actually be a cause for concern, especially in terms of the fine balance between rights and freedom.

This holds especially true for mobility at the workplace. So one of the biggest challenges technology teams in any enterprise face is how far to go and where to draw the line.

To ensure a good work-life balance, enterprises today are definitely allowing personal spaces to encroach onto office time, and in many cases that is unavoidable, just as office work almost always takes up personal space as well.

The flexibility to carry personal mobile devices at work does, to some extent, send out a message that the organisation appreciates the concept of work-life balance. But while we do believe in allowing mobile devices at work, there are a number of issues that need to be resolved at the end of the day.

It is common knowledge that security is a big issue with a Bring Your Own Device (BYOD) policy. But we believe the solution is not to get rid of the policy altogether; it is to set processes and policies in place that will control usage and curb malpractice and security risks.

The big concern that global companies face is that employees are averse to their personal devices being monitored and controlled.

While security policies may be effective enough for the present, the proliferation of mobile applications in every sphere of activity leaves constantly evolving gaps in security. What, then, is our solution? Obviously, as a team that supports the IT and IT security policies of the organisation, we need to ensure complete awareness of the risks involved, and of the procedures that will keep enterprise data safe even on personal devices, which may go beyond the mobile phone or the laptop.

The issue of all-around, holistic security is always a challenge. Enterprise control over corporate data residing on the mobile device will always be a requirement with a BYOD policy in place. But while enterprises have the ability to access personal data, they may choose not to enforce scrutiny, as a mark of trust. It is time that employees also develop a deeper insight into the balance between BYOD and their corporate responsibility, and a commitment to keeping enterprise data uncompromised. Every employee needs to feel responsible for the security of corporate assets and information. We bank on this sensitivity, but keep policies in place.

That is the only way to truly have equilibrium between work-life balance, and the responsibility of enterprise security.

Ultimately, the balance has to be about responsible ownership, by both enterprises and employees. And in this, enterprises may be more inclined to have some clear policies in place...and also not be shy of enforcing them.



Good hotels have affable staff to help you at check-in, but those with the best service go a step further.

For The Oberoi Group, which has around 13,000 employees and operates 31 hotels as well as luxury cruisers on the Nile in Egypt and on the sea off Kerala in India, technology has arguably been one of the main driving forces.


Ever since it embarked on its virtualization journey, the Group has seen vast improvements in internal operations and guest service quality. The hotels have not experienced any downtime, allowing employees to focus on guest services such as checkout and bill reconciliation, and on completing corporate sales, without being concerned about the availability of desktops and core IT systems.

"By 2010, The Oberoi Group had extended its business from largely building and owning hotels to include managing properties. Our corporate IT department undertook an initiative to evaluate technologies that could centralize administration, reduce post implementation operating expenses, lower hotels' IT-related heating, lighting and other power costs, and improve service quality," says Ashish Khanna, Assistant Vice President Corporate IT, EIH Ltd (a member of The Oberoi Group).

The IT team's first move was to implement virtualization at three of its newest hotels viz. Oberoi, Gurgaon; Trident, Hyderabad and Oberoi, Dubai.

"The fact that we were developing new infrastructures for these hotels gave us a tremendous opportunity to innovate," says Khanna. "We wanted to implement technologies that could meet our current requirements and our future needs."

The IT team tested its critical guest-facing and back-office applications on a range of solutions. "We found VMware Horizon View and VMware vSphere to be more mature in functionality and performance than other solutions," recalls Khanna.

This involved implementing VMware vSphere with ESXi to run core business applications, and VMware Horizon View to deliver virtual desktops on thin-client terminals to back-office employees.

"The Oberoi, Gurgaon project was one of the first virtual desktop deployments in the hospitality sector in India," says Khanna. Each project was completed in around 60 days.