Topics of Interest Archives: Cloud

The Business Analyst’s Next Frontier: Advanced Analytics

The Next Frontier in BI

In prior blogs, we have posited that the features and functionality that were once differentiators in the BI and analytics space are becoming increasingly ubiquitous. That is to say, cloud, mobile, and self-service access to reports and dashboards have moved from the realm of unique competitive advantages to table stakes.

While there are certainly battles yet to be won on pricing, user interface, extensibility, data management, customer service, and a host of other factors, some of the most intense competition is happening around advanced analytics. The term “advanced analytics” can encompass a variety of undertakings, from geospatial analysis and graph data analysis to predictive modeling.

The upside of adopting such capabilities is that organizations can get far more out of their data than they would from simply reporting on it via dashboards. For instance, undertakings that improve forecasting, reduce customer churn, cut fraud, or optimize production capacity can all have a meaningful impact on a firm’s bottom line.

In this case, the Holy Grail is figuring out how to give regular business analysts the ability to perform analysis that was traditionally reserved for specialized teams of data scientists. Just as desktop data-discovery tools and self-service BI helped to democratize access to data throughout the company, vendors are hoping to extend advanced analytics to a broader population of business users. The result has been an arms race, as vendors rush to build, buy, or partner their way into the conversation.

The constraint for many organizations has been the price tag. Even if the explicit cost of an advanced analytics solution is low (through integration with something like the open source R language, for instance), the investment of time and personnel is often still significant. Simply put, advanced analytics expertise is expensive. Providing advanced analytics capabilities to business analysts delivers value in two directions. Companies without prior investments in advanced analytics can now perform basic forecasting and modeling that was previously out of reach. For companies with existing investments and teams of experts, lower-level and less-complex requests can be pushed down to business analysts, freeing data scientists to take on more complex challenges.
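To make “basic forecasting and modeling” concrete, here is a minimal sketch (in Python, with made-up revenue numbers) of the kind of trend projection these tools put behind a point-and-click interface. It illustrates the task itself, not any particular vendor’s implementation, which would typically use far more robust models.

```python
# Minimal sketch: fit a linear trend to monthly revenue and project ahead.
import numpy as np

monthly_revenue = np.array([120.0, 125, 131, 128, 136, 142, 147, 151])  # illustrative
months = np.arange(len(monthly_revenue))

slope, intercept = np.polyfit(months, monthly_revenue, 1)  # least-squares trend line
future_months = np.arange(len(monthly_revenue), len(monthly_revenue) + 3)
forecast = slope * future_months + intercept
print(np.round(forecast, 1))  # naive three-month-ahead projection
```

The point is that the analysis itself is no longer the barrier; packaging it so the analyst never has to see the model is where vendors now compete.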

Business decision-makers evaluating their next BI and analytics investment should consider to what extent advanced analytics capabilities are built in. Mega-vendors such as IBM, SAP, and Microsoft have responded by releasing freemium offerings that give business analysts an accessible means to try their hand at these capabilities.

To this end, IBM Watson Analytics has taken an impressive leap in integrating forecasting into a visual interface that masks the complexity of the underlying SPSS functionality required to perform the analysis. From a user experience perspective, Microsoft’s release of Power BI takes a similar approach in that it integrates natural language processing so that users can ask questions of their data in a code-free environment. The cloud-based model and low cost further extend its accessibility to business analysts.

In a similar vein, SAP Lumira is working toward the convergence of data discovery and advanced analytics through continued integration of its Predictive Analysis suite. After acquiring KXEN in 2013, SAP has prioritized its assimilation into the overall analytics portfolio, the end goal (and important differentiator) being that a business analyst using the Lumira interface will have access to advanced analytical functionality backed by SAP’s enterprise-grade data governance and the performance of the HANA platform.


Coming from a different angle, a few emerging advanced analytics players, such as Alpine Data Labs, Alteryx, and RapidMiner, are quickly introducing code-free environments with powerful capabilities. The blending, modeling, and automation capabilities of these companies hold up even at massive scale, making them important tools for augmenting the capabilities of data scientists and regular business analysts alike.

It is important to note that what we are talking about, in many of these cases, is an extension of the business analyst’s capabilities. We are broadening what can be accomplished if we take BI to its next logical extension. I’m not suggesting that these functionalities can displace truly advanced analysis across complex time series, data mining, or multivariate environments, or across Big Data sets from sources like machine sensors. Rather, businesses are demanding greater functionality on top of core BI and analytics capabilities, and the vendors who deliver on this will stand to gain.

There will be inevitable pushback as more advanced functionality gets placed in the hands of users who are not formally trained data scientists. Making business decisions on shaky analysis is a dangerous proposition, and no doubt many will wonder if giving functionality to those who don’t know how to use it might be akin to “giving them enough rope to hang themselves.” This is largely a paternalistic viewpoint that will taper off in the same way that similar fears about desktop data discovery did. Vendors will continue to build out guided user experiences to assuage these fears, and organizations will put safeguards in place to ensure the quality of the underlying analysis. Ultimately, signs point to a future where the barriers to advanced data analysis continue to be lowered, and the capabilities of the data analyst continue to expand.


Microsoft’s Power BI Will Transform Enterprise BI in 2015

Microsoft announced on January 27th that it is planning to make its self-service BI solution, Microsoft Power BI, available for a free preview to any United States-based user with a business email account. Microsoft also provided a preview of Power BI for iPad and is planning to create iPhone, Android, and Windows apps for further mobile BI support.

In addition, Microsoft plans to make the newest version of Power BI available as a free service when it reaches general availability. There will also be a Power BI Pro offering, so that Power BI will, in effect, become a freemium service similar to Microsoft’s existing Skype and Yammer services. The Power BI Pro offering will increase data capacity from 1 GB to 10 GB per user, accelerate streaming data from 10,000 rows per hour to 1 million rows per hour, refresh data on an hourly basis, and provide embedded data management and collaboration capabilities.

To prepare for this change, Microsoft also plans to move the price point for Power BI Pro to $9.99 per month, a significant reduction from current price points for cloud BI.

There are several key ramifications for enterprise BI that Blue Hill’s community must understand immediately:

1) Microsoft is taking a freemium approach to BI with the goal of owning the end-user market. This is a strategic approach based on Microsoft’s view of the new, consumerized model of software purchase and acquisition, and it demonstrates Microsoft’s willingness to commoditize its own price points and products for the long-term battle of winning cloud and end-user mindshare. Microsoft has learned to execute on this freemium model from several key consumer and freemium investments over the past decade: Xbox, Skype, and Yammer.

In pursuing Xbox revenue, Microsoft has had to learn the consumer gaming and media market, and it has gained a deep understanding of demos and consumer advertising that it previously lacked. In addition, Microsoft’s acquisition of Skype has led to its management of a free service that has fundamentally transformed communications, and has even led Microsoft to rename its enterprise communications service from “Lync” to “Skype for Business.” And, finally, Microsoft’s acquisition of Yammer was initially seen as confusing, given how directly Yammer competed against collaboration stalwart SharePoint. However, as Microsoft has continued to execute on the development of Skype and Yammer and has begun integrating those services with its traditional Lync and SharePoint services, it has become obvious that Microsoft is willing to compete against itself and to take on the challenging transformational tasks needed to compete in a cloud and mobile world.

In this regard, Microsoft is actually in a less challenging situation with Power BI, in that it never fully invested in creating or buying a Business Objects, Cognos, or Hyperion-style BI application suite. This means that Microsoft is able to position itself for a cloud BI world without having to directly compete against its own products. At the same time, expect Microsoft to bring to Power BI all of the freemium and agile development best practices that have led to the success of Xbox, Skype, and Yammer.

2) Microsoft is also planning to commoditize the mobile BI market. With the impending launches of Power BI for iPhone, Android, and Windows, it is difficult to imagine mobile BI as a premium product going forward, at least in terms of pricing. Mobile BI is already basically table stakes from an RFP perspective, but high-quality mobile BI will now be necessary even for free and freemium BI offerings. In 2010, mobile BI and high-quality visualizations were key differentiators. In 2015, they are just basic BI capabilities. Companies seeking differentiation in BI will increasingly look at professional services, vertical expertise, and the ability to eliminate both implementation time and IT support to reduce the basic total cost of ownership for BI.

3) Comparing cloud BI pricing apples to apples is becoming more difficult. Although Power BI’s current and intended pricing models are fairly straightforward, one of the challenges in cloud BI is that every vendor provides a different set of resources and capabilities to support its on-demand and subscription services. As a quick example, consider how Power BI will compare against Jaspersoft, which provides BI services on an hourly basis on Amazon Web Services.

Power BI will provide its Pro service at $9.99 per month, or roughly $120 per year. A variety of cloud BI services such as Birst, GoodData, and InsightSquared could come in at about $1,000 per year per user for a standard out-of-the-box implementation. In contrast, Jaspersoft supports an annual instance on AWS at 55 cents per hour on an m3.medium EC2 instance, which provides about 4 GB of memory; this adds up to about $3,750 per year. So, is this a simple comparison?
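For readers who want to check the arithmetic, here is a quick back-of-the-envelope sketch. One caveat worth flagging: at full 24x7 usage, $0.55 per hour annualizes to roughly $4,800, so the ~$3,750 figure above presumably reflects fewer active hours or discounted instance pricing.

```python
# Back-of-the-envelope annualization of the cloud BI price points cited above.
power_bi_pro = 9.99 * 12            # ~$120 per user per year
typical_cloud_bi = 1000.0           # Birst / GoodData / InsightSquared ballpark, per user
jaspersoft_24x7 = 0.55 * 24 * 365   # ~$4,818 per instance per year, no user limit

for name, annual in [("Power BI Pro (per user)", power_bi_pro),
                     ("Typical cloud BI (per user)", typical_cloud_bi),
                     ("Jaspersoft on AWS (24x7 instance)", jaspersoft_24x7)]:
    print(f"{name:35s} ${annual:,.0f}/yr")
```

The per-user versus per-instance distinction is exactly why, as discussed below, the raw numbers alone do not make the comparison simple.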


Consider that Power BI provides a standard BI environment, but will not be customized out-of-the-box to immediately support standard data sources such as Salesforce.com. Birst and GoodData will provide various levels of integration, data structures, and vertical expertise in their implementations, while a sales analytics specialist such as InsightSquared could potentially implement a new solution with Salesforce data in a matter of minutes. And Jaspersoft’s offering will be better suited for embedded BI because it imposes no user limits. So, even with the impending price war that Microsoft will drive, companies will still have to carefully define their BI use cases and select potential solutions accordingly. However, Blue Hill expects that standard BI capabilities on a user-specific basis will become as easy to access as Skype, Yammer, an Xbox game, or Facebook (another Microsoft investment).

In 2015, Microsoft will shift the fundamental BI purchase decision from “How much does it cost to get BI?” to “How much will it cost to get BI expertise that is aligned to our organization?” The answer to the former question could well become “Microsoft.” The answer to the latter question is where other vendors will need to compete.


Taking Stock of the Legal Cloud (2/2): Paths to a Secure Legal Cloud

Previously, I observed how the evolution of the cloud has led to considerable growth in cloud solutions within legal environments. At the same time, concerns about the security and privacy of cloud environments have created obstacles to adoption among the profession. For the legal community, the contradictory opportunities and risks presented by the legal cloud result in a tension between attitudes that, at their extremes, we can refer to as “cloud complacence” (an uncritical trust in cloud providers) and “cloud anxiety” (an uncritical refusal to consider cloud solutions). In effect, these attitudes work either to increase law firms’ vulnerability to risk, on the one hand, or to deprive them of the real benefits of cloud solutions, on the other.

Part of the problem is that both cloud anxiety and cloud complacence stem from very reasonable responses to cloud computing. It is not unreasonable to believe that cloud providers, by the very nature of their expertise and business models, will invest in the security and integrity of their solutions, generally with a sophistication that is lacking at law firms. Nor is it unforgivable to feel uncertain about the sufficiency of these efforts, particularly given some high-profile incidents that have erupted over the past year. In fact, for a reasoned articulation of (and response to) cloud anxiety, see Sam Glover’s take on Lawyerist. The trick lies in understanding how much trust or suspicion (or both) is reasonable, in order to find a balance between the risks and benefits of the cloud. This requires understanding the nature and sensitivity of the data that you are putting into the cloud, and how a particular solution protects and potentially exposes that data.

There are several relevant factors to consider here. First, a basic understanding of what’s involved in data security under the mobile-cloud paradigm (the successor to the endpoint-server paradigm):

1) Servers – Generally, cloud offerings move data that was held on dedicated hardware physically located within the walls of the firm to remote, shared servers controlled by third parties. What those third parties do to protect and maintain the integrity of these servers is thus an important aspect of cloud security. It is also the most obvious element to consider. Other important questions here relate to redundancy and failover across multiple locations, and the extent to which server space is shared or dedicated.

2) Transfer – For the cloud to work, the data and applications stored on remote servers must be accessible to users through their computers and mobile devices. How this data is exposed or protected in transit between cloud servers and these access points is also a crucial element of the overall security of the cloud. The core questions here typically relate to identity verification and encrypted data transfer (see the sketch following this list).

3) Access Points – One of the advantages of the cloud is the freedom it opens up to access data from a wide variety of devices and locations. This also increases the opportunities for exposure. Many devices automatically log into cloud systems and save local copies of the files stored on the cloud servers. As such, we need to be concerned with the security of the device itself, as well as the ability to control it after it leaves the physical possession of the firm. The security literacy of users is often an important element here as well.
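As a concrete illustration of the transfer question, the minimal sketch below (Python, standard library only) performs the kind of basic due-diligence check a firm can run against a provider’s endpoint: confirming that the connection negotiates a modern TLS version, and inspecting who issued the certificate and when it expires. The hostname is a hypothetical placeholder, not a real vendor endpoint.

```python
# Minimal sketch: checking the TLS posture of a (hypothetical) cloud endpoint.
import socket
import ssl

def check_tls(hostname: str, port: int = 443) -> None:
    # create_default_context() enables certificate and hostname verification.
    context = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
            print("Protocol:", tls.version())             # e.g. TLSv1.2
            print("Expires: ", cert["notAfter"])          # certificate expiry
            print("Issuer:  ", dict(pair[0] for pair in cert["issuer"]))

check_tls("cloud.example-legal-vendor.com")  # placeholder hostname
```

A failed handshake or verification error here is itself informative: it means the provider’s endpoint could not prove its identity to a standard client.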

Different providers take different approaches in how they address these needs, leaving firms with a range of options to consider. Let’s look at a few basic approaches to provide some context for these strategies, and what they mean for your firm’s use of the cloud.

Showcasing Fortification

Again, we’ll start with the obvious option. Many legal cloud vendors have responded to the market’s concerns by improving encryption and server security, and the need for strong security has prompted vendors to use their security efforts as a point of differentiation. Key factors here are the security certifications and protocols used by the cloud provider. Firms with dedicated IT resources can suss out the meaning of the terms used in these environments, but smaller firms often lack the background to translate the terms and standards referenced into a practical understanding of how secure a given solution will be.

While a little self-education is a healthy thing, vendors often opt to use a number of shorthand tricks to signal the trustworthiness of their platforms by highlighting the:

- Number of certifications obtained. For example, cloud practice management provider Clio highlights that it possesses three certifications (by VeriSign, TRUSTe, and McAfee Security), even if the standards themselves are somewhat redundant, primarily verifying the use of Secure Sockets Layer (SSL) encryption (although the TRUSTe certification also identifies incorporation of its privacy standards).

- Adoption of known security standards or industry requirements. For example, Box and Microsoft Matter Center for Office 365 both underline their compliance with HIPAA and EU security and privacy standards as a way to indicate their appropriateness for legal environments. (Microsoft also lists ISO 27001 and Federal Information Security Management Act compliance, and goes so far as to identify its own security expertise as a consultative value-add for legal customers.) MyCase (which leases cloud space from Amazon Web Services EC2) and cloud ediscovery provider Logikcull both take pains to identify that they leverage “bank-grade” security (which, again, is largely SSL).

- Physical security at data center sites. Box and MyCase highlight the physical security and disaster precautions of their data centers. Kroll Ontrack goes further, identifying steps taken to ensure temperature control and power supply redundancy.

Moving to a Private Cloud

Generally speaking, when we refer to cloud offerings (in the legal sector or otherwise), we are speaking of “the public cloud,” or cloud resources that are available for public use. On public clouds, server space is shared, and an individual user’s data might be distributed across multiple servers and data center locations. In this way, public cloud offerings maximize the economies of scale that supply the cost advantages of cloud solutions, but they can also create exposures and a lack of transparency regarding data location and control.

Private clouds represent an effort to avoid the latter issues through dedicated cloud resources. While they can be provided by third parties (or maintained internally), private clouds are distinguished in that the servers involved are used to support only a single organization. This helps the organization maintain control over the network. In addition, hybrid clouds offer a middle ground, segregating data between private and public clouds as appropriate.

Typically, legal cloud providers are public cloud providers, with private and hybrid offerings generally provided by core IT infrastructure vendors, such as IBM, HP, VMware, and others. The leading voice for private clouds in the legal technology space has been Abacus Law. While its roots lie in practice management, Abacus Law has recently made strides as a hosted legal infrastructure provider through its Abacus Private Cloud environments. The provider takes an agnostic approach to its private cloud offerings, which do not tie customers to its practice management solutions or to any particular application. In fact, the company has indicated its willingness to run other vendors’ solutions within its environments, effectively adding an extra layer of assurance for cloud offerings and flexibility for other applications.

Private clouds reduce some of the risks of public clouds, but they are not a panacea. In particular, they do not necessarily alleviate the need to perform complete due diligence. Firms still need to understand the security related to servers and data transfer, particularly with respect to hosted solutions. Private clouds also do not protect the end access points of the solution.

Flexible Deployment

A third approach taken by vendors is to maintain flexibility in deployment, offering customers the ability to select cloud or on-premises options rather than forcing them into a particular offering. Generally speaking, these efforts are dictated by a desire to meet varying customer needs. As such, in some part, they function as accommodations to cloud anxieties. Prominent examples of this strategy include Microsoft’s Matter Center for Office 365 and Amicus Attorney, both of which have stressed the flexibility to offer public cloud, hybrid cloud, and on-premises options. Ediscovery vendors, who frequently encounter tensions between data storage, multi-party access, and high privacy sensitivity, have been particularly open to maintaining flexible deployment options. To this end, Guidance Software, kCura, Recommind, Kroll Ontrack, and LexisNexis Concordance (to name just a few) all offer both hosted and on-premises solutions.

Ultimately for the vendors, this approach is about preserving opportunities by adapting to end-user comfort levels. For end users, it’s about obtaining the desired software capabilities with the flexibility to select or avoid the risks of cloud deployment. However, while this approach offers multiple paths, it does not necessarily answer questions about the vendor’s cloud solutions. In other words, while vendors falling into this category can often respond to end user preferences for deployment, firms selecting cloud options will still need to perform full due diligence regarding the solution.

Securing Access and Collaboration

The final category we’ll consider is primarily about securing the access points mentioned above. If the other categories largely related to differentiating solutions by reassuring firms about server and data-transfer security, this category is about mitigating the risks associated with the expanded accessibility of cloud offerings. In other words, we’re discussing approaches intended to neutralize access-point risks.

Because this is a by-product risk of the legal cloud, rather than a barrier to adoption, this area has not received the same amount of focus as the approaches mentioned above. That said, a few players have sought ways to combat these issues, primarily by partnering with enterprise mobility management (EMM) and mobile device management (MDM) providers, which bring expertise in supporting the distribution and control of data across large and diverse sets of device users. Leaders in this area include LexisNexis, for its integration of Firm Manager with WatchDox, and kCura, for its integration of Relativity Binders with MobileIron. Generally speaking, these integrations focus on combatting end-user risk by providing the capability to monitor, manage, and eliminate cloud access and data use on individual devices.

While the opportunities created by these integrations largely turn on the use of a particular legal function-oriented vendor (typically practice management), other vendors have also focused on this need. To this end, EMM vendor AirWatch has sought to provide device and mobile content management capabilities independent of other solutions. Similarly, Box has focused on managing and monitoring file permissions, access, and use within its storage environments. Microsoft’s Matter Center product responds to these concerns by keeping all data within cloud environments, eliminating local data exposures.

By and large, major movements in this area involve either dedicated offerings or integrations with cross-enterprise providers tailored to the legal space. That does not mean other options are unavailable. In particular, the last year has seen the entrance of TitanFile, which stands out as a provider focused on offering a secure collaboration platform for the legal space without tying users to a particular data or document management environment. Rather, TitanFile encrypts files at the end-user source and serves as a content management and secure collaboration layer for attorney-client communications and document sharing.

Determining the Fit to Your Organization

Given the variety of paths that vendors take across these needs, it can be difficult for firms to compare providers and determine exactly what they need. In practice, this reinforces the need for self-education on the part of firms regarding the mechanics of the legal cloud. At the same time, it points to the need for a dedicated data security standard within the legal industry. The closest we currently come is ILTA’s LawSec effort to disseminate ISO 27000 within the legal industry. While ISO 27000 is a prominent and well-regarded standard, it is not tailored to the legal sector.

There is a significant opportunity here for solution providers, firms, state bars, and professional associations to come together to develop a meaningful set of requirements and certifications for the industry. Even if it is just an application of ISO 27000, the creation of industry-specific standards would go a long way toward facilitating law firms’ understanding (and likely adoption) of security practices, as well as helping firms navigate a path between the extreme responses to the legal cloud.


IBM + Xamarin: Is It an Enterprise Mobile App Golden Age or a Bubble We're In?

In case you weren’t aware of it, IBM’s expansive mobile group has recently formalized a partnership with Xamarin, a very sharp company that will now provide IBM with a mobile development platform that allows C# developers to easily build native mobile apps for iOS, Android, and Windows Phone. The Xamarin platform includes a number of developer tools as well as a cloud-based app testing ecosystem. On the IBM side, IBM’s MobileFirst platform – which includes IBM’s own Worklight mobile app development platform – will provide Xamarin-built apps with cloud and backend enterprise connectivity and data services.

The Xamarin and IBM partnership drives home for me that mobile app development in the enterprise is becoming extremely “frothy.” Though I believe that we’ve been riding the enterprise mobile app wave for several years now, mobile app and MBaaS vendors alike are making a lot of noise about 2014 and 2015 proving to be the true “tipping point” years. For argument’s sake, I will grant them this point. That leaves me wondering, however, whether we are now entering a true golden age for enterprise mobile app development, or whether we are instead watching a bubble emerge that may be nearing its bursting point.

I will come back to Xamarin, IBM, and the question of an enterprise app development platform bubble. But first, a few more words on MBaaS platforms, which are important to Xamarin’s future success, are in order.

MBaaS Matters a Great Deal

Last week, I spent some time thinking about MBaaS (Mobile Backend as a Service) becoming the new enterprise mobile architecture of choice. There is one very interesting and key underlying notion about MBaaS: its major goal is to give enterprises a great deal of freedom (or a liberation from the shackles of enterprise IT infrastructure, to put a bit of a literary feel to it) to focus their time and efforts on developing rich mobile solutions that meet business needs.

Cloud computing and platform as a service (PaaS) capabilities that easily replace old-school infrastructure are two of the critical markers that define MBaaS. There are two other markers. The first is the ability to “easily” connect with the myriad backend app and data servers and other enterprise sources (which can include the occasionally looney legacy system, such as an early-1990s VAX) that a business may need to tap. Extensive yet simplified backend connectivity capability truly defines MBaaS – at least, that’s what I think.

I can also add DBaaS – the emerging Database as a Service “next wave” – to the mix here, which startups such as Orchestrate are moving to deliver on. From the 20,000-foot POV, DBaaS provides a simple set of APIs that a company can use to connect to numerous and diverse backend database systems. I’m going to leave DBaaS for another day, but keep it in mind nonetheless.
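To make that 20,000-foot view slightly more concrete, here is a minimal sketch of the kind of uniform key/value REST interface a DBaaS exposes. The base URL, collection name, and auth scheme are illustrative placeholders, not any specific vendor’s actual API (Orchestrate’s included).

```python
# Minimal sketch of a generic DBaaS-style REST interface. The endpoint and
# credential below are hypothetical placeholders for illustration only.
import requests

BASE = "https://api.example-dbaas.com/v0"
API_KEY = "YOUR_API_KEY"  # hypothetical credential

def put_record(collection: str, key: str, value: dict) -> None:
    # One HTTP verb per operation; the service maps the call onto whatever
    # backend store actually holds the collection.
    resp = requests.put(f"{BASE}/{collection}/{key}", json=value, auth=(API_KEY, ""))
    resp.raise_for_status()

def get_record(collection: str, key: str) -> dict:
    resp = requests.get(f"{BASE}/{collection}/{key}", auth=(API_KEY, ""))
    resp.raise_for_status()
    return resp.json()

put_record("customers", "cust-1042", {"name": "Acme Corp", "status": "active"})
print(get_record("customers", "cust-1042"))
```

The point of the sketch is the shape of the interface: the application never sees which database engine sits behind the collection.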

The final marker is the very open-ended nature of MBaaS on the mobile app development platform side of things. As important as the cloud and backend services of MBaaS are to its immediate and long-term success, the flexibility enterprises gain in the development tools they can use (such as Xamarin) to actually build their mobile apps may prove the most significant marker overall – and, ultimately, the greatest driver of mass MBaaS deployments.

From here, it is just a very short leap to extending the enterprise mobile app possibilities out to both the Internet of Things and to enterprise wearable tech. 2015 will indeed be a very interesting year for enterprise mobility!

Lots of Flexibility and Choice

Before I go on, I want to make absolutely clear that there is an enormous amount of complexity underlying MBaaS. The MBaaS vendors have taken on an extraordinary technical challenge. Making cloud-based services and complex backend access and implementation appear “easy” to the enterprise – such that enterprise IT teams can almost think of an MBaaS as a nifty mobile development black box – is an unparalleled technical achievement. By this, I mean to equate MBaaS to the emergence of LAN/WAN in the 1990s, and of the Internet and Web since the late 1990s, and their total integration into the very DNA and fabric of all businesses, large and small.

In a few years, all enterprises will have fully integrated MBaaS into their DNA as well. I will go so far as to say that I’m highly confident the security that is part and parcel of successful MBaaS platforms will be such that even today’s on-premises-bound verticals – healthcare in particular – will eventually find themselves MBaaS-based. The demise of on-premises computing is close at hand!

What the MBaaS vendors have achieved is a pure cloud and backend technical accomplishment. But in the grand continuum of enterprise mobility we arrive now at the ultimate judge or arbiter of any mobile application and development effort – the end user (whoever that may be – workforce, partners, customers, or large scale collections of consumers).

One thing MBaaS platforms cannot ensure is how delighted end users will be with the mobile applications that are ultimately delivered through them. The technical wizardry (and occasional black magic) employed by the MBaaS vendors can only go so far… they can and will free up enterprises to focus on their business needs, but they cannot help businesses actually develop their mobile-based business solutions and apps. Of course.

What MBaaS does do is create a great deal of freedom for enterprises to pick and choose the actual app development platforms that are preferred within an organization or that an organization’s development team may have expertise in. This approach maximizes developer flexibility, and minimizes the need for developers to have to use specific and likely unfamiliar tools required by a given platform.

The reason that the MBaaS vendors focus a great deal of marketing effort on the ability to create “agile” mobile app development environments for their customers is this developer tool flexibility. That flexibility in turn gives organizations a great deal of opportunity to focus specifically on business needs as the basis for quickly delivering finely-tuned mobile apps. This is something I will be exploring in detail over the coming weeks and won’t take any further here. It is worth mentioning, however, that the Xamarin-IBM partnership now exists at least in good part for this very reason.


Are Xamarin and IBM a Good Match?

As a front-end development platform and framework, Xamarin has gained a lot of ground in a relatively short period of time. It claims that its platform – which focuses entirely on C# developers – is now used by more than 750,000 developers, some of whom come from more than 100 of the Fortune 500 (including Dow Jones, Bosch, McKesson, Halliburton, Blue Cross Blue Shield, and Cognizant). That is a heady number of developers, representing more than 10 percent of the total estimated population of just over 6 million C# programmers.

The partnership with IBM gives Xamarin’s developers integrated access to IBM MobileFirst – and IBM Worklight – which provides Xamarin-built mobile apps with an entirely new suite of secure cloud-based data services (sounds like MBaaS, doesn’t it?). Xamarin and IBM now provide an SDK for Xamarin developers that can simply be embedded in their mobile apps to integrate, secure, and manage those apps with the IBM MobileFirst platform.

There is much more to what the two companies actually put on the table, but the implementation details aren’t important here. What is important is that Xamarin is now able to provide IBM-sourced cloud and data services capabilities for those Xamarin developers that can benefit from it.

IBM, meanwhile, adds yet another arrow to its already-full mobile quiver. Xamarin integration provides IBM with the ability to offer its enormous collection of mobile customers additional flexibility and choice in how they want – or prefer – to build their apps. Xamarin, in turn, gains IBM’s mobile endorsement through the partnership; that will clearly open many new doors for Xamarin.

So yes, it is definitely a good match.

A Bubble or a Golden Age?

The answer to the question I’ve posed depends entirely on whether or not I’m right about how MBaaS is going to play out. If MBaaS does indeed emerge as a technology that becomes part of overall business DNA (again, as LAN/WAN and the Internet/Web have), then it makes a great deal of sense to have substantial app development flexibility and a choice of development platforms and frameworks.

If MBaaS deployment runs into roadblocks, and if other cloud service options that limit developer choice emerge and become dominant instead, then the current proliferation of MBaaS and app development platforms (along with all the startups in the space) will indeed look like an unsustainable bubble.

That won’t happen, though – I like to think I’m right about MBaaS.

Enterprises really do face a tremendous need to get great mobile apps out the door – and this is generating enormous enterprise demand for MBaaS and for developer flexibility and choice. Assuming that businesses take their strictly business-side mobile homework seriously, the infrastructure and development tools will be there to get high-quality mobile apps out the door.

Red Hat/FeedHenry, Kinvey, Pivotal/Xtreme, Appcelerator, AnyPresence, Kidozen, Cloudmine, Sencha, Xamarin, Orchestrate and many other startup and established vendors (among them the usual suspects amid the giant tech companies) all stand to make a mark here. Enterprise mobility is ready to pay out on the bet.

For those of us who have been waiting since the early 2000s for such a mobile moment to become real, it is indeed looking like a golden age is finally here.


The Wave Breaks, and the Analytics Cloud Is Revealed

Ever since the “analytics cloud” was leaked by CEO Marc Benioff on September 15th, technologists and industry experts have been trying to figure out exactly what Salesforce was planning to do. The guessing was fueled by the knowledge that Salesforce has long been rumored to be interested in improving its own analytics; after all, it had already acquired a visualization vendor, EdgeSpring, in June 2013.

But until now, the guesses and punditry were based purely on speculation. Now that Dreamforce has arrived, both Salesforce and its partners have started to clarify details about Wave, the Salesforce Analytics Cloud, through public interviews and a series of press releases from companies joining the “Salesforce Analytics Cloud Ecosystem.” Some wondered whether the analytics cloud would directly compete against BI vendors, while others guessed that it would simply be a stronger visualization layer. However, the announcements made this week are starting to bring the true goals of the analytics cloud into focus in a couple of ways.

First, all of these press releases share a common paragraph describing Wave, the Salesforce Analytics Cloud:

“Wave, the Salesforce Analytics Cloud, is the first cloud analytics platform that enables admins, IT and developers to work closely with business leaders to empower everyone to make smarter, data-driven decisions in seconds. Natively integrated with Salesforce1 Platform, Salesforce Analytics Cloud benefits from the trusted platform and enables admins to quickly drag and drop Salesforce data to deploy sales, service and marketing analytics apps. In addition, developers and IT can use new Wave APIs and other data connectors to easily connect to any other data sources to build any custom analytics app for any business function, or embed analytics into an entirely new generation of analytics apps and connected products for customers.”

This initial description indicates that Salesforce is planning to take greater advantage of the sales, marketing, and service data that it already holds, but that partners will be doing a lot of the processing, integration, back-end, and possibly even some front-end work to support this Wave.

Second, there is already a broad ecosystem willing to partner with Salesforce on this new wave. To gain a quick sense of the breadth and depth of this ecosystem, take a look at some of the vendors that have made an announcement on Monday, October 13th: 6Sense, Appirio, Apttus, Betterworks, BMC, C9, Cloud Sherpas, Dell Boomi, Dun & Bradstreet, FinancialForce, FirstRain, Fliptop, Gainsight, Informatica, InsideSales, Kenandy, Lattice Engines, Mindtouch, Model N, Mulesoft, Predixion, PTC, PwC, TalentObjects by Lumesse, and Wise.io.

These vendors break down into several basic categories: sales and marketing metrics, advisory and consultancy firms, customer behavior solutions, integration vendors, enterprise applications, and talent management. All of these capabilities are well established as either part of the direct or indirect services surrounding Salesforce’s core Customer Relationship Management functionality. Because of this, it seems more likely that Salesforce Wave will end up being very focused on line of business analytics rather than the traditional worlds of business intelligence and data management.

What will this mean for your business? In an upcoming Market Alert, Blue Hill will continue to analyze these announcements and discuss how the emergence of Salesforce’s Wave will impact key technical, financial, and operational stakeholders.


Red Hat, FeedHenry and Kinvey - Driving MBaaS as the New Mobile Enterprise App Play

Several weeks ago, nearly $100 million was put on the enterprise mobile app table, reflecting sharp bets that the new mobile application platform game in town is MBaaS – Mobile Backend as a Service. The first of these bets was placed by Red Hat when it reached an agreement to acquire FeedHenry – in my opinion the most mature of the independent MBaaS vendors – for a substantial $82 million. For a company sporting a single previous investment round of $9 million, that amounts to solid FeedHenry ROI for all concerned.

Nearly concurrent with the Red Hat-FeedHenry acquisition, the second bet came in the form of a Series B funding round for Kinvey, which I believe is a close second to FeedHenry in its MBaaS strengths and capabilities. The Series B is worth $10.8 million; coupled with an earlier Series A round of $7 million, investors have now placed a $17.8 million bet on Kinvey, and quite honestly that looks to me like a damn fine bet to have made. I expect big things from Kinvey – and no doubt there is yet another major MBaaS acquisition to be made here at some point.

My detailed report on Red Hat’s acquisition of FeedHenry underscores not only the many good reasons why FeedHenry and Red Hat are a perfect fit for each other (Red Hat provides substantial enterprise scale and market penetration, FeedHenry gives Red Hat true enterprise grade mobile app capabilities), but also outlines the advantages that MBaaS platforms bring to the mobile enterprise application game.

Enterprise mobility has been around for more than 10 years, though it wasn’t relevant on a large scale until the current decade arrived, bringing with it Apple’s iPhone and iPad and, of course, BYOD. Mobile development platforms (we all know the old acronyms – MADP, MEAP) have traditionally been built around client-server architectures and have ruled the enterprise roost for many years. Some of these platforms were better for building monolithic, single-purpose mobile apps, some lent themselves to mobile Web development, and the majority have required dedicated application development and management resources.

MADPs have certainly served useful enterprise purposes over the years, and there would be no enterprise mobility today without them. But technology constantly evolves, and to make a very long story short, even the most state-of-the-art of these now-legacy platforms cannot keep up with today’s across-the-board enterprise move to cloud-based computing. Their client-server roots now work against them, and evolving their core platforms into modern enterprise architectures is not a viable option. Embracing HTML5, Web apps, and native-hybrid app development is as far as they’ve come… but it isn’t enough.

Today’s modern day architectures – SaaS, PaaS, BaaS and now MBaaS – are the realities of today’s enterprise world. Collectively, cloud computing and these services drive the ability for enterprises to become far more interactive and dynamic. Mobility – driven by both consumer and workforce demand – now requires enterprises to literally “live in the moment.” Client-server mobile app platform architectures simply don’t lend themselves to tackling today’s definition of dynamic interactions.


Today’s Definition of “Dynamic”

Ironically, mobile apps used to define dynamic enterprise applications – they were, of course, built to interact with users (whether workforce or external consumers) under anytime, anywhere conditions. But in truth, these apps have tended to be static entities – once built, more often than not at significant development cost, they became single-purpose apps.

Today’s definition of dynamic mobile apps has changed, and I’m not overstating it if I suggest that the definition has changed radically. The enterprises that can lay claim to being today’s mobile pioneers are those that are not only extending mobility out to the workforce, partners, customers, and consumers, but are also living in the moment by building mobile apps dynamically.

The MBaaS mobile vendors all have a specific common view of how enterprises must now build their mobile apps to stay competitive or to gain competitive advantages:

- Primary stakeholders – line of business (LOB), IT, and finance – need to be able to come together and quickly develop apps that meet revenue-generating and/or business intelligence-gathering requirements (this is in fact our own Blue Hill mantra for any enterprise technology development)
- Other key stakeholders – for example, the CMO’s office – need to be easily integrated into planning and development processes
- IT must have at its disposal significant flexibility to quickly bring together in the cloud both front-end/user-interface capabilities and easy backend connectivity to numerous potential resources
- Both mobile app planning and development must be collaborative and agile in nature – this requires embracing the myriad development frameworks that are likely to already exist within a business and that developers already know well
- The key MBaaS competitive advantage is that all of the underlying complexity is handled by the MBaaS platform, leaving enterprises free to focus specifically on business issues, user interfaces, and – most critical of all – in-the-moment development of apps that can quickly be put into the field
- FeedHenry in particular strongly suggests that in today’s dynamic, in-the-moment world, many mobile apps should be “disposable” – that is, built quickly and at very low cost to serve specific needs that are likely to be short-lived

I think that paints the right picture.

My Blue Hill mobile analyst colleagues and I are all of a mind that MBaaS is the necessary new enterprise mobile app game. There are a number of key vendors in the space – Red Hat/FeedHenry, Kinvey, Pivotal/Xtreme, Appcelerator, AnyPresence, and Kidozen key among them – that all enterprises will soon get to know quite well. Those that don’t embrace MBaaS will find themselves at a competitive disadvantage.

I am now in the process of researching MBaaS-based enterprise mobile app development. Blue Hill Analyst Insight and Anatomy of Decision reports are in the works and they will go into significant detail on what I’ve only scratched the surface of above.

Before wrapping up, I will note that all of the old caveats for mobile app development (at least, “old” as of the end of 2013!) still apply. Even in an emerging world of disposable and in-the-moment mobile app development, mobile strategy remains, as I am extremely fond of claiming, “a long term strategy and never a short term fix.” In truth, the more things change, the more they remain the same – I’ve also long been extremely fond of noting that the three key business components of any enterprise mobile app project are rapid development, speed to market, and VERY reasonable cost.

I’m thrilled the MBaaS vendors are embracing these concepts. I am even more thrilled that through these MBaaS vendors enterprises will in fact be able to actually deliver on them as well!

MBaaS is the smart enterprise bet to make.


Informatica Springbok Shows How Data Is the New Bacon

Data is the new oil. Data is the new currency. Some even say data is the new bacon. Regardless of which metaphor you prefer, the ability to contextually prepare and cleanse data is a core business activity that has expanded far beyond IT. The reason is simple: there is a contradictory trend in the business world in which line-of-business users become more savvy about their data requests while the flood of data increases exponentially. This leads to a demand for data quality that far outstrips IT’s ability to manage and cleanse data adequately.

As a result, data analysts spend an estimated 70-80% of their time simply manipulating and organizing data, rather than actually analyzing it, as they make sure that zip codes, Dun & Bradstreet IDs, geographical information, and other basic business information are correctly formatted and do not conflict with related data. For instance, a zip code of 98501 would not match up with the state of Maine, but with the state of Washington. These inconsistencies can lead to greater problems when companies seek to use their data to conduct relevant analysis.

Although noticing a mismatched zip code and state is easy on a one-off basis, it becomes far more challenging to do systematically (see the sketch below). Traditionally, the answer to this problem has been to send the data to a database administrator or data analyst who could properly join, query, and identify data inconsistencies by writing SQL or other query-language code. Despite the hype around the value of data-driven decisions, the barrier to entry for data editing and quality efforts has traditionally been too steep for line-of-business personnel to easily traverse. As a result, data management has traditionally been trapped in the role of IT rather than the role of business; even simple data transformation and cleansing efforts get stuck in a repetitive cycle of requests sent to an analyst who lacks the business knowledge to cleanse the data to the end user’s satisfaction, which leads to further requests.
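For a sense of what “systematic” means here, the minimal sketch below flags zip-to-state mismatches across a whole table at once – the kind of rule an analyst would otherwise hand-code in SQL. The prefix map is a tiny illustrative subset, not a complete reference table.

```python
# Minimal sketch: flag rows whose state conflicts with their zip code prefix.
import pandas as pd

ZIP_PREFIX_TO_STATE = {"985": "WA", "040": "ME", "041": "ME"}  # partial, illustrative

records = pd.DataFrame({
    "customer": ["Acme", "Globex", "Initech"],
    "zip":      ["98501", "98501", "04101"],
    "state":    ["WA", "ME", "ME"],
})

expected = records["zip"].str[:3].map(ZIP_PREFIX_TO_STATE)
records["state_mismatch"] = expected.notna() & (expected != records["state"])
print(records[records["state_mismatch"]])  # flags Globex: 98501 is WA, not ME
```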

Companies are quick to acknowledge that line-of-business personnel are best suited to actually sort and fix data sources to meet business needs, but those personnel have traditionally lacked the tools to properly cleanse data. Although traditional ETL vendors such as Informatica, Talend, Oracle, and Pentaho have provided data quality tools for a long time, these tools have been either too expensive or too complex for line-of-business personnel to use. In light of this gap, startups such as Paxata and Trifacta have emerged to handle this challenge. Although these tools are easier to use than legacy products, and have led the way in simplifying data transformation, they lack the ability to save and recommend the data transformations commonly used within an organization, and they do not scale transformations to the enterprise level so that data is continuously improved and updated on an ongoing basis. They don’t take this next step in end-user support.

In this context, Informatica Springbok is an interesting new data quality product. First announced at Informatica World in May 2014, Springbok is now in general availability and allows end users to cleanse data by providing automated suggestions based on data format, type, and context. It also performs semantic analysis of all ingested data to understand whether the data is geographical, topographical, or sociological in nature. Springbok also supports automatic joins between datasets by identifying columns that can be matched (a minimal sketch of the idea follows below). This functionality allows end users to directly sort and cleanse data without having to make requests to IT departments. When a user creates a sorting or cleansing effort, Informatica also takes the next step of allowing IT departments to put this logic directly into Informatica’s PowerCenter offering, which allows companies to scale any self-service data quality effort without having to recreate or rewrite it. This innovation is especially interesting because it allows the data transformations that have typically been trapped in individual spreadsheets to be quickly implemented throughout the enterprise if they turn out to be useful.
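To illustrate the automatic-join concept, here is a minimal sketch that proposes join keys by finding columns two datasets share by name and compatible type. It shows the idea only; it is not Springbok’s actual matching logic, which also weighs format and context.

```python
# Minimal sketch: suggest join columns shared by two datasets.
import pandas as pd

def suggest_join_columns(left: pd.DataFrame, right: pd.DataFrame) -> list:
    shared = set(left.columns) & set(right.columns)
    # Keep only columns whose dtypes are compatible on both sides.
    return sorted(c for c in shared if left[c].dtype == right[c].dtype)

crm = pd.DataFrame({"account_id": [1, 2], "region": ["East", "West"]})
billing = pd.DataFrame({"account_id": [1, 2], "mrr": [500.0, 750.0]})

keys = suggest_join_columns(crm, billing)   # -> ['account_id']
print(crm.merge(billing, on=keys))
```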

Springbok also has a social aspect that has largely been missing from data quality efforts. Although other products allow multiple users to cleanse data simultaneously, they do not allow end users to actually search through a library of data transformations easily, or to suggest data transformations based on an end user’s current role, location, or context. As an example, if a sales person finds a way to effectively sort leads in a way that provides greater accuracy, Springbok could recommend this sorting capability to other sales people. Springbok can also provide application suggestions based on the data sources that are used by various roles. For instance, if a sales person were using CRM data, Springbok might see that this data is often combined with marketing automation data, and provide the user with a suggestion to also access marketing automation software.

Although Springbok’s support of on-premises data and legacy Informatica solutions may not be as exciting as the ability to open up data to line-of-business end users, this support does allow Springbok to be used even in organizations that may not be running on the most current software or databases. This is especially important in verticals where companies do not need to upgrade software and data environments, but still desperately need to understand and cleanse their data to make better decisions. By using a self-service data quality solution, end users in these environments are positioned to take advantage of an existing data warehouse investment, or to simply analyze data collected from applications that lack their own embedded analytics or data standardization tools.

Blue Hill also found that Informatica is not simply resting on its laurels with the launch of Springbok: Informatica’s Vice President of Strategy, Keyur Desai, also mentioned the potential of supporting product information data and master data management efforts in the future through the same logic used in Springbok. This would be a welcome change, moving transformation technologies that have typically been controlled solely by data analysts into solutions that are actually accessible to the line-of-business users who should theoretically be able to take advantage of this data. Although Informatica Springbok’s social recommendations, scalability, and reuse already make it a valuable tool for any organization seeking to improve data quality, the potential to bring a Springbok-like user interface and intuitive tools across Informatica’s portfolio is an even more transformative opportunity for enterprise data management.


If data is the new oil, Springbok is the new refinery. To maximize the value of data, companies must make data clean, relevant, and accessible. By allowing business users to quickly summarize and fix data inconsistencies, conduct role- and context-specific editing of data, and view relevant data sets and sources, Springbok should be seen as an immediate opportunity to bring data quality into the hands of end users, both for existing Informatica customers and for non-customers seeking a standalone tool that is friendly to business users without formal training in enterprise data quality and management techniques. Informatica currently makes Springbok available at no cost with 50 GB of storage, 10 million rows of import data, and 20 million rows of export data per month. Users can register for the freemium version of Springbok, and Springbok users can also join a user community for help requests.


Abacus Private Cloud for Secure And Flexible Law Firm IT (Infographic)

A cloud platform makes systems and servers remotely available from anywhere with an internet or mobile connection. As such, cloud platforms expand a law firm’s access to its documents, files, and applications while reducing IT costs and burdens. At the same time, uncertainty regarding security, data ownership, and attorneys’ obligations around technology due diligence and confidentiality has led to two extremes: (1) complacent disregard for cloud drawbacks and (2) anxiety and total refusal to adopt cloud solutions. To help firms evaluate solutions, this infographic profiles two small law firms’ investigation of cloud options and selection of Abacus Private Cloud.




Ericsson to Acquire MetraTech - Internet of Billable Things Heats Up

Earlier this week, Ericsson announced its planned acquisition of MetraTech, a Boston-based billing and settlement solutions provider that serves a diverse customer base including the likes of Accor Group, Concur, DTCC, LifeLock, Microsoft, O’Hare International Airport, PGi, and Telus. The acquisition, set to close by the end of Q3, makes a lot of sense as companies look to get a jump on the Internet of Billable Things™ (IoBT). It provides Ericsson with a flexible solution that can adapt alongside still-developing business models and customer relationships, and it plays well with Ericsson’s fast-growing OSS/BSS business. It also lines up well with trends we’ve seen developing in platform-based innovation, particularly in light of PTC’s acquisition of ThingWorx.



At Bustling, Heavily Attended SAPPHIRE NOW 2014, SAP Intros its Own Take on KISS

KISS – you know, Keep It Simple, Stupid – is a directive that is usually quite appropriate whenever technology is involved. And now, thanks to SAP AG, we can rejigger it a bit to read, "Keep It Simple, SAP." More specifically, "Run Simple" is now officially SAP's new tagline, its new marketing strategy, and the new cornerstone of its product development strategy. It is a very interesting move for SAP, for various reasons and on a variety of levels.

The obvious reasons are the ones SAP's now-sole CEO, Bill McDermott, emphasized during his SAPPHIRE NOW 2014 keynote address last week. Simplicity brings a number of things to the game for both SAP and its customers: decreased complexity (of course), ease of use, far better UIs, increased productivity, speed to market, lower TCO, and increased strategic advantage – which more often than not equates to non-trivial increases in top-line revenue. Increased top-line revenue plus lower TCO leads, of course, to non-trivial increases in profitability. That is the business side of things.

On the technical side, SAP is essentially engaged in, to paraphrase SAP Chairman Hasso Plattner from his SAPPHIRE keynote, "a full-scale exercise of radical, disruptive innovation" across SAP's core platforms (which amounts to tackling over 400 million lines of existing code!). Keeping things simple is a hugely complex undertaking. Plattner and various guest speakers delivered plenty of compelling insights during his keynote. In particular, Plattner shared the stage with Clayton M. Christensen, the Harvard Business School professor and author of "The Innovator's Dilemma: When New Technologies Cause Great Firms to Fail" – the two of them together make the keynote a must-watch.

The other day my research colleague Scott Pezza (@scottpezza) – a braver soul than I – provided great insights into the Run Simple mantra, using SAP's own Simple Finance (SFIN) on HANA to illustrate the Run Simple possibilities. Make sure to read it! In truth it would take roughly 200 blog posts to cover everything SAP delivered, but Scott's attempt to capture it is right on the money. My goal is to capture the somewhat more amorphous issue of what underpins Scott's post and the other 199 posts we could write had we but world enough and time, and why I believe SAP will pull it off.

The issue of simplicity – to put it as simply as possible – has at its core SAP's full-throttle move into cloud computing, and the continuing success and adoption of its in-memory database HANA (3,200 SAP customers and rapidly growing).

I consider HANA one of the seminal IT developments of the decade, and in his post Scott pulls some useful insights from Plattner's keynote on why it is seminal and critical to SAP's future. Let's just say that real-time, on-the-fly analysis of terabytes of data is non-trivial and an absolute requirement for any business to succeed going forward (the sketch below gives a flavor of what that looks like in practice). HANA is a platform I will keep a very close eye on – especially as it has numerous implications for mobility – but after this we'll leave HANA for the rest of this post.
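As a purely illustrative aside, this is roughly what an ad hoc aggregate query against HANA looks like from Python using SAP's hdbcli client. The connection details and the sales table are hypothetical stand-ins, not a real deployment:

    from hdbcli import dbapi  # SAP's Python driver for HANA

    # Hypothetical connection details, for illustration only
    conn = dbapi.connect(address="hana.example.com", port=30015,
                         user="ANALYST", password="***")
    cursor = conn.cursor()

    # An ad hoc aggregation the in-memory column store can answer
    # at query time, without pre-built aggregates or cubes
    cursor.execute("""
        SELECT region, SUM(revenue) AS total_revenue
        FROM sales
        WHERE order_date >= ADD_DAYS(CURRENT_DATE, -30)
        GROUP BY region
        ORDER BY total_revenue DESC
    """)
    for region, total in cursor.fetchall():
        print(region, total)
    conn.close()

The point is not the code but what is absent from it: no pre-aggregation, no overnight batch – the column store does the heavy lifting at query time.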

A Head in the Clouds
It is important to touch on the cloud computing end of things. SAP is clearly headed toward an all-cloud, all-the-time model. This is in fact a good thing, and over the long term it will begin to pay dividends for SAP and its customers as the company continues to invest in its cloud computing initiatives, especially HANA in the cloud.

Just how deeply SAP is diving into cloud computing has been fairly apparent for some time. But perhaps the first real indication of how important the cloud is becoming (or will become) was SAP's announcement back on May 6, 2014 that it is teaming with virtualization software and budding mobility vendor VMware to offer SAP's software platforms and HANA as cloud-based services in pre-configured ways. Instead of buying separate SAP and VMware software packages and running them on dedicated servers, customers will be able to load their data and analyze it in the cloud at much lower cost and with greater operational flexibility.

EMC (which owns a majority stake in VMware) and Oracle have both already announced similar cloud moves, and the idea should certainly not befuddle any SAP customers. It is the necessary next step for SAP to become a fully 21st-century software player. In working with VMware directly, SAP takes the issue of its customers having to find a virtualization partner off the table – call it operational simplicity, which is of course a permutation of Run Simple. The VMware partnership begins to demonstrate how SAP's Run Simple future will manifest itself.

There is one significant gap in the SAP cloud computing storyline: while SAP did an excellent job of detailing what it intends to accomplish through its cloud computing initiatives, it was clear from SAPPHIRE NOW attendees that they would have liked something more than vague promises on when SAP hopes to actually deliver the various components of those initiatives. SAP no doubt knows that it needs to put real timeframes in place, and I am confident that it will do so.

One key to HANA’s ongoing success has been SAP’s clear ability to compellingly demonstrate what it does. Over the last several years there have been very, very few instances where SAP was “all talk and no action” on HANA – I actually can’t think of any. The power of HANA has been easy for SAP to demonstrate both through customer deployments as well as through its own use of HANA within a variety of industry solutions SAP sells. SAP has even been able to get smaller businesses on the HANA train by partnering with Amazon/AWS to provide SMBs a way to access its power – no need to be a Fortune 250 business, a strong message on many levels.

SAP would be well advised to move quickly to demonstrate the power of the cloud in the same way. That is, Run Simple cannot just sit out there as a marketing tagline that cynics and competitors can attack. The tagline needs to immediately translate into actionable events for SAP's customers. Yes, of course SAP knows this, but if any members of SAP's Supervisory Board think the company can slide on announcing real timelines, they are very mistaken.

But let’s not make light of just how complex an undertaking this is for SAP and how complex it will be for SAP’s customers – especially the top 5,000 or so. There are enormous SAP on-premise implementations that will remain earthbound for years – if not decades – to come. IBM’s SAP Global Services teams (IBM is SAP’s single largest partner) will certainly not have to worry about seeing engagements suddenly come to an end. To put the overall challenge for SAP into better perspective, consider that SAP has somewhere in the neighborhood of 225,000 customers and roughly 2 million users.

It is indeed a long-term project for both SAP and its customers – the transition to cloud-based offerings is in truth already four years old, and HANA's genesis goes back at least seven and a half years – and it will clearly be SAP's responsibility to ensure the Run Simple message doesn't simply become a throwaway marketing line. The truth is that for its largest customers this will be a very incremental process – which is fine, as long as SAP manages expectations and delivers the goods.

Interestingly, it will be the smaller SAP shops that cash in sooner rather than later. In a sense, this is what cloud computing has always promised, right? Can't afford an IT staff? Go to the cloud. Don't want to invest hundreds of millions of dollars in your own data center? Get on the cloud. Run Simple will be a grass-roots endeavor, but I believe it will quickly prove itself – SAP should be able to grow its cloud business quickly enough, and mature it over time, to finally bring the top 5,000 in line – maybe 5, 10, or 15 years down the road.


SAP CEO Bill McDermott’s Star
I am a fan of Bill McDermott. As co-CEO, a role he assumed in 2010, he successfully trod a fine line between pushing SAP to step up its game looking forward and ensuring that SAP's old management guard wasn't thrown for a loop. In the past there was always a strong sense of SAP looking slightly forward while remaining tightly locked into its historic roots. I've long wondered whether SAP would be able to make the transition to becoming a fully 21st-century company.

When SAP finally announced that McDermott's thoroughly home-grown co-CEO Jim Hagemann Snabe would step down and that McDermott would become sole CEO in 2014, I was quite pleased to hear it. McDermott has a real fire under his feet and will, I believe, quickly push SAP (though remember that "quickly" is a very relative term here) into the 21st century.

This year’s SAPPHIRE NOW CEO keynote fully underscores this.

Interestingly, before McDermott took the stage, SAP introduced three ridiculously young people on the cutting edge of today's technology world (watch the CEO keynote video – they are very impressive). One reason they were there was to underline the need for all of us – and especially SAP and its customers – to undertake "disruptive innovation." Why is this important? Consider the following:
[Image: Professor Christensen's three innovation charts from the SAPPHIRE keynote]

These three charts – which Professor Christensen presented during his segment of Plattner's keynote – together show the different types of innovation that are possible. It's not my goal here to recount all of the notions and implications inherent in Christensen's Innovator's Dilemma paradigm, but it's worth a very quick run-through to understand where SAP and its customers find themselves.

The chart on the left demonstrates “sustained (albeit incremental) innovation” – which in fact is nothing more than constantly improving an existing product. Think of it in SAP terms as SAP continuing to merely improve its on-premise-based platforms, with incremental improvements that merely lead to more of the same, if marginally improved, approaches SAP customers have traditionally taken. Note that the trend line sits at the very top of the chart, meaning that a lot of time is absorbed (typically as part of the process of “listening to customers at the top of the food chain”) in delivering minor improvements. Think of it as delivering slightly better versions of business as usual.

Innovation becomes disruptive only when it causes wholesale changes in how a given market (in this case SAP's customer base) does business. Christensen's belief (as detailed in his book) is that successful companies (in this case SAP) more often than not put too much emphasis on customers' current needs (sustained innovation) and fail to adopt the new technologies or business models that will meet unstated or future needs (in a sense this echoes Steve Jobs, whose mantra was that customers don't know what they want; it's our job to tell them). When companies fail to adopt these new technologies and business models, they will eventually fail.

In the case of SAP the disruptive new technology is cloud computing, where the fastest growing competitors to SAP operate. To its credit SAP has not only fully embraced cloud computing but has delivered disruptive innovation of its own on top of it with its in-memory and cloud-based HANA database.

The middle chart demonstrates the introduction of new technology delivered much more quickly than would be the norm under sustained innovation. The problem here is that very large companies often find it difficult to adopt as quickly as the new technology becomes available. Recall our point above that SAP's largest customers will very likely take decades to move away from older on-premise platforms. Ah, there is the rub – or the dilemma.

The third chart essentially states the obvious – enough customers need to be able to take advantage of the new/disruptive technology or the new innovations simply won’t matter. As a majority of customers do adopt the disruptive technology, those at the top of the food chain will have no choice but to follow and adopt as well – or per Christensen they will ultimately fail.

This brings me back to Bill McDermott. As detailed in his keynote, McDermott's strategy for ensuring that his customers, whether large or small, can adopt, absorb, and utilize SAP's new cloud and HANA platforms is twofold: make adoption simple, and ensure that adoption in turn leads to major simplifications in the numerous business and technical processes customers use with SAP's platforms, services, and products. Hence Run Simple. Scott Pezza's blog post noted earlier provides useful examples of this.


Run Simple strikes me as a uniquely Bill McDermott strategy. It is forward-looking and seeks to unshackle SAP from its past, so to speak. I am not convinced that SAP could have put such a strategy in place under the old co-CEO structure, and that is the main reason I'm glad to see McDermott firmly in sole possession of the role.

Yes, there is the Supervisory Board to contend with, but it is McDermott's responsibility to ensure that SAP finds the right singular voice, and the right message to preach with it. Run Simple is that message, and from what was demonstrated at SAPPHIRE NOW 2014 I am convinced that the entire company has that singular voice in place. I recommend watching McDermott's video – there is much there to take in.

Roadblocks?

There certainly are a few.

One that comes to mind – the biggest in fact – is the SAP sales force, especially those on the elite end of the force who have traditionally focused on huge on-premise platform sales to perhaps a single company. How does such a collection of uber-SAP salespeople transition to selling a cloud-based subscription model? How do they go from one-off sales of premise-based platforms and the typically huge professional services revenue that goes with them to a model that revolves around recurring revenue and a Run Simple strategy that suggests a major drop in professional services revenue?

Next, I found it disturbing to learn that Vishal Sikka – who was not only informally known as the father of HANA but in fact thought of HANA as his child – recently resigned (for personal reasons, though sometimes that is merely code for something else, perhaps a clash with McDermott as he took on his new role). A mere few days later, Shawn Price, who had been the outspoken evangelist and leader for SAP's cloud initiatives, departed as well. Perhaps the two of them, despite working on SAP's disruptive efforts, were too "old school SAP" for McDermott. Are the departures an internal SAP roadblock, or do they open the road up? Hard to say.

We noted this earlier but it is worth repeating – SAP needs to add real timelines to its product delivery plans. Customers need to know.

Finally, as impressive as the Run Simple message and its promises are, only time will tell whether SAP will indeed deliver on them all. Significant customers are already on board with where SAP is headed – ConAgra and John Deere, for example, are both moving ahead with new and quite complex implementations (both provided insights during Plattner's keynote). SAP will need many more of these customers at the high end of the spectrum, and it needs to demonstrate that tens of thousands of its smaller customers will very quickly get on board as well.

The irony, of course, is that Run Simple is an incredibly complex undertaking for SAP. The hardest thing to accomplish in technology is to keep things simple: the simpler a product appears on the surface, the more complexity is invariably required under the hood to make it so.

I believe SAP has the right CEO in Bill McDermott to pull off all of SAP's grand plans. I will be keeping watch on SAP's doings – primarily as they relate to enterprise mobility, but then, mobility now touches everything in the enterprise…

My money is on SAP (and its customers) to succeed.

