In most areas of business, inaction can be just as impactful as action. Enterprises typically view the cost of not acting as an opportunity cost, or, at most, an indirect bottom line impact. But for enterprise mobility, not acting has both direct and indirect cost implications from lost financial, technical, operational, and strategic value. In fact, the direct monetary cost of not acting is actually higher than the cost of Managed Mobility Services. How can that be? Let’s dig into the numbers…
By Blue Hill estimates, unmanaged direct mobility costs can be 20% overweight compared to a managed environment. For the average billion-dollar revenue company – with $5 – $10 million in telecom/mobility spend – expense management alone can be a million-dollar savings opportunity.
Unmanaged environments generate significant costs from fees (such as late fees or overage charges), as well as service order placement and support. Apart from monetary costs, not acting also presents opportunity costs from lost productivity and technical debt. Potential revenue-generating activities are re-allocated to overhead or administrative tasks, and device downtime is frequent and lengthy. Finally, the enterprise does not have a coordinated, long-term mobility strategy in place, and thus faces strategic costs.
The cost of not acting can be substantial. But what if the enterprise does act, using in-house resources to match the capabilities provided by a third-party MMS vendor? Typically, it will spend more and receive a lower level of service than an MMS vendor can provide. Between helpdesk, email, security, and invoice management, enterprises devote two to three full-time equivalents for every 1,000 devices, based on Blue Hill discussions with enterprises. Based on an average annual salary of $63,000 for an entry-level telecom engineer, and a 1.3 multiplier for the fully loaded cost of an employee, this results in a labor cost of approximately $164,000 – $246,000 per year.
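The labor math above can be checked with a quick calculation. The figures are those cited in the text; the two-to-three FTE staffing ratio is Blue Hill's estimate:

```python
# Quick check of the labor-cost estimate cited above.
BASE_SALARY = 63_000         # entry-level telecom engineer, annual salary
LOADED_MULTIPLIER = 1.3      # fully loaded cost of an employee
FTE_RANGE = (2, 3)           # FTEs per 1,000 devices (Blue Hill estimate)

loaded_cost = BASE_SALARY * LOADED_MULTIPLIER    # roughly $81,900 per FTE
low, high = (n * loaded_cost for n in FTE_RANGE)
print(f"${low:,.0f} - ${high:,.0f} per year")    # $163,800 - $245,700 per year
```

The result matches the $164,000 – $246,000 range quoted above, after rounding.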
Direct mobile costs, fees, and service order costs can be reduced somewhat, compared to an unmanaged environment, but at the expense of using IT resources for low-value, high-effort tasks such as resolving carrier disputes or sorting through bills – tasks that are not a core use of IT resources. Finally, in-house support costs are significantly higher than the support costs for Managed Services, as support is often bundled into the MMS contract. To achieve the same level of service in-house would require significant internal support resources. While enterprises may be able to re-create the capabilities of an MMS vendor in-house, they will typically do so at a much higher monetary, opportunity, and employee cost compared to a dedicated MMS vendor.
Managed Mobility Services comes out ahead. Overall, Blue Hill estimates that the direct device and data costs of a standard enterprise can typically be driven down to less than $60 per device per month through a coordinated, well-managed, and effectively-sourced approach. But cost savings are not the most impressive part – more impressive is the return on investment generated over a three-year period by Managed Mobility Services.
Based on conservative Blue Hill estimates, the cost reduction from IT resources and carrier expenses alone can yield a three-year ROI of 184%. This assumes 20% carrier savings in the first year, declining as the environment is optimized, plus 50% IT overhead savings. Blue Hill has seen carrier savings in excess of 40% and the elimination of direct in-house IT mobility support, which would increase this ROI substantially. Blue Hill notes that, even with conservative estimates, Managed Mobility Services can provide higher levels of service than in-house management, and a three-year ROI ranging from 150% – 450%, by reducing monetary, opportunity, and employee costs for the enterprise.
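As a sketch, the ROI arithmetic behind figures like these can be expressed as follows. The savings and cost inputs below are illustrative placeholders, not the actual inputs behind Blue Hill's 184% estimate, which are not given in the text:

```python
def three_year_roi(annual_savings, annual_costs):
    """Simple cumulative ROI: (total savings - total cost) / total cost, as a percentage."""
    total_savings = sum(annual_savings)
    total_cost = sum(annual_costs)
    return (total_savings - total_cost) / total_cost * 100

# Placeholder inputs: first-year carrier savings that decline as the
# environment is optimized, against a flat annual MMS program cost.
savings = [1_000_000, 800_000, 700_000]   # hypothetical annual savings
costs = [300_000, 300_000, 300_000]       # hypothetical annual MMS cost
print(f"{three_year_roi(savings, costs):.0f}% three-year ROI")  # prints "178% three-year ROI"
```

Any real calculation would also need to account for one-time transition costs and the opportunity value of redeployed IT staff, which this simple formula omits.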
There is a clear cost of not acting for Managed Mobility Services. Blue Hill describes this problem in greater detail in our recent report, aptly titled, The Cost of Not Acting for Managed Mobility Services. Blue Hill did the math: Managed Mobility Services provide a higher level of service than in-house or unmanaged environments, while delivering a significant three-year return on investment.
Ask a line-of-business or IT manager to list, one by one, each step that her enterprise took to end up at its current mobility strategy, and she likely won’t be able to tell you. Business leaders aim to take steps that lead logically toward a well-defined, long-term vision, but for enterprise mobility, adoption rarely follows this path. Often, an enterprise will invest in mobility or adopt managed services in stages – an inefficient, and potentially costly, strategy that delays the enterprise’s ability to unlock the strategic value of mobility and to achieve digital transformation. There are many roads to Managed Mobility Services, but taking the one less traveled – adopting a full-suite managed services contract from the start – might make all the difference.
Businesses were slow to predict the impact that mobile would have on their workforce and on their business operations overall, and thus few enterprises put a long-term, cross-departmental mobility plan in place before beginning to invest in mobility. This left many with mobile environments that support multiple carriers, device types, applications, and departmental policies without a coordinated, organization-wide approach that spans purchasing, logistics, implementation, kitting, replacement, bill pay, and so forth.
Though this piecemeal approach is sub-optimal, once in place – short of a significant business case or a major catalyst event forcing the enterprise to act – it is likely to persist, both for simplicity and to avoid the need to address and prioritize the various stakeholder interests involved in enterprise mobility.
Generally, expense management presents the clearest business case for an enterprise to pursue managed services, due to the visibility of expenses in the enterprise. Thus, an expense management contract will often be adopted first. By Blue Hill estimates, unmanaged direct mobility costs can be 20% overweight compared to a managed environment. For the average billion-dollar revenue company, telecom and mobility spend averages $5-10 million per year, making this a million-dollar savings opportunity.
After businesses have made the case for Telecom Expense Management (TEM) solutions, they will often pursue additional managed services to achieve greater cost savings and efficiency gains. Blue Hill documented the costs and benefits of various mobility strategies in our recent report, The Cost of Not Acting for Managed Mobility Services.
The timeframe for adopting components of MMS varies, but most enterprises generally seek to support financial, technical, logistical, and/or strategic needs through managed services. Successful Managed Mobility Services support some or all of the following components of enterprise mobility:
Financial: contracts, invoice management, payments, data consumption/roaming, dispute management
Technical: kitting, staging, content, data, identity, security, apps
Logistical: sourcing, device fulfillment, device repair/replace, device replenishment
Strategic: mobile business assessment, health and security check
Enterprise mobility needs become more complex over time, and rise up the enterprise hierarchy from the basic ability to use mobility, to security and governance, to the widespread adoption of mobility, and then finally to more strategic or transformative uses of mobility. Enterprises can achieve these high-level hierarchical needs through multiple managed services contracts that the enterprise has invested in over time. However, the greatest strategic and transformative value for managed services is achieved when an enterprise pursues a full-suite Managed Mobility Services contract initially, giving the vendor visibility into all areas of the enterprise’s mobility environment: expenses, operations, logistics, and even applications and security settings at the device level.
Utilizing a single vendor for managed services creates synergies in financial, operational, technical, and strategic value by placing all responsibility with a vendor that acts as a single point of contact for all enterprise mobility needs. By managing all aspects of an enterprise’s mobility strategy, the MMS vendor can seek cost savings and efficiency gains throughout the entire mobility lifecycle, with a greater understanding of how to optimize the environment from a financial, operational, technical, and strategic standpoint.
For enterprises with an existing TEM contract: it’s not too late! Blue Hill recommends that these enterprises pursue opportunities for additional managed services with their existing vendor relationship, or look to outside vendors if their TEM vendor does not also support Managed Mobility Services. For enterprises with an unmanaged mobility environment, Blue Hill recommends considering a single vendor for all Managed Mobility Services to achieve the greatest potential strategic value from the relationship.
Anyone who has met me observes two things: I have pink hair and I am passionate about aligning B2B sales and marketing. Enterprise buyers today shop for solutions the same way they buy TVs and jeans. And that’s forcing a change in the way enterprise sales and marketing communicate with the buyer, and the tools we use to support that.
In the past couple of years, there has been an explosion of new solutions in what is being called Sales Enablement. But what is “enablement” exactly? It’s not in the Merriam-Webster dictionary. So we have a challenge: a new industry that’s bursting at the seams, ripe with confusion and noise, defined by a term that’s not a real word. That’s exactly why I joined Blue Hill Research. I want to help create some meaning and structure out of this chaos.
I’ve been watching the advancing changes in sales and marketing since 2009, as LinkedIn was just getting traction with 50 million users (today they have over 400 million) and we all started to get a lot more connected with our buyers via mobile and social. At the time, I was managing an inside sales team and we were noticing that our buyers were behaving differently, that they were spending time in forums and social media discussing best practices and getting input from one another on what solutions to consider. We began experimenting with new approaches including social media outreach and listening, account-focused research and targeting, and leveraging specific content across the sales cycle. With these new approaches and experimentation, we cut our sales cycle by more than half, beating the industry average overall by a considerable amount. We learned that we could differentiate by selling differently.
Yet, we struggled with the tools we had to work with, and providing metrics and reporting to show management what we were doing was nearly impossible. That experience got me hooked. I wanted to figure out what was going on with buyers and how I could learn more and improve on that experience in the sales and marketing functions. Recently, I’ve noticed a growing number of tools, in addition to the original CRM systems, designed to help sales people more easily, quickly, and effectively engage with the buyer in their journey. A lot of them tap into the mobile and social aspect of engagement today. Automation, AI, and predictive analytics are being tried out in the sales world, a place they had never been before. Sales is demanding more educational content tailored to each step in the decision process. Social Selling became a “thing”. In 2014, I joined the association for inside sales professionals (AA-ISP) and became chair for the San Diego chapter. The AA-ISP is a global organization dedicated to the profession of inside sales and sales development. At their last annual leadership meeting, the number of new vendors in the expo was astonishing. All seem to have similar messaging. How can one tell them apart? How does an enterprise figure out which ones fit together best for their company? Which are viable? Where does one start to build requirements for these new tools? How can you ensure adoption once implemented? Which ones are working, which ones aren’t? What’s the ROI on any of this?
This is why Sales Enablement excites me. It’s a brand-new space. It’s an industry that’s forming, learning, maturing. There is a lot of stuff to figure out. There will be consolidation. There will be successes and failures. It’s stimulating to be witnessing the birth of a new category, one that is serving a new and evolving need. Joining the Blue Hill Research team, I bring my experience from the other side of the table when I was part of the emerging Telecom Expense Management industry in the early 2000s. As a TEM vendor, I relied on groups like Blue Hill Research to help us figure out what the market wanted, honing our platform, our service delivery, and our implementation process as we matured. That experience will help me to ask the right questions and provide guidance to this maturing category of Sales Enablement.
Sales Enablement Is More Than Just Technology
Sales Enablement can’t just be about technology and tools; it has to start with a defined, structured process that addresses buyer and customer engagement. This is a process that touches multiple stakeholders in a company: marketing, sales, and customer success. Tools and technology support this process. Sales Enablement is still in its early stages with new vendors emerging each month and experiencing growing pains getting their products to market and in deployment and adoption. My goal is to help vendors better understand their customers and to help buyers understand how to cut through the noise and hype so that they can successfully select the right solutions for their needs and utilize sales enablement in their environment.
For Sales Enablement Vendors
I am here to help sales enablement vendors to:
I am here to help enterprises looking to improve their sales processes to:
Most vendor landscapes I’ve seen lump pretty much any company who touches the sales, marketing, or customer success together. I see everything from point solutions to complete platforms lumped together. Some include CRM solutions. My goal will be to break this down to provide more clarity for both vendors and users. I’ll be monitoring the trends – new entrants, mergers and acquisitions, funding, and innovations that will benefit users.
Bottom line, my goal is to prevent “shiny rock syndrome” along with the heartache and wasted time and money that go with it.
On March 6, Cass announced that it has acquired Effective Telecoms Ltd. (Efftel), a privately-held, UK-headquartered telecom expense management (TEM) company. The acquisition positions Cass to deliver telecom, facility management, and transportation services on a global scale, and makes Cass an immediate player in the European markets. For Efftel, Cass’ leadership as a business process outsourcing vendor and its reach in verticals such as utilities, waste, facility management, and transportation will provide Efftel customers with an expanded range of services and support.
With offices in both England and The Netherlands, Efftel will expand Cass’ reach in the European TEM market and add several key European accounts to its portfolio. Cass’ year of record-high earnings in 2016 positions the company well for continued growth and investment, particularly in the TEM space. Blue Hill notes that Cass reported fourth quarter 2016 growth from new customer wins, including accounts that migrated to Cass from competitors, and this growth is likely to have been driven significantly by Cass’ TEM business.
Efftel provides mobile and fixed telecom audit, procurement, and expense management services, as well as Mobile Management Services, including ordering, provisioning, helpdesk, financial management, and reporting and analytics for mobile devices. Efftel customers will be able to leverage the range of services offered by Cass, including sourcing, global bill payment, invoicing, auditing, negotiating and disputes, implementation, inventory and usage management, Business Intelligence (BI), reporting, and Managed Mobility Services (MMS).
Beyond TEM, Cass also supports categories such as transportation, facility management, utilities, and waste management. Additionally, Cass has a fairly broad partner channel including CompuCom and Lumenate for MMS and BYOD services, as well as Arrow Electronics for IT recycling capabilities.
Prior to the acquisition, Cass offered its full range of services in the US, Europe, Latin America, and Australia, and as of September 2016, had recently added audit and dispute, contract, billed inventory, and managed disputes to its managed services in the APAC (Asia/Pacific) region. David Rosenthal, founder and managing director at Efftel, will join Cass as managing director of European Services.
Blue Hill notes that TEM vendors active in the European and especially U.K. markets should be aware of Cass’ acquisition of Efftel, as it positions Cass as an immediate competitor. From our coverage of the TEM market, Blue Hill has observed that Cass typically targets companies with around $20 million in annual telecom and mobility spend, with a significant portion of its business coming from Fortune 500 companies. With its expanded presence in the U.K. and Europe through its acquisition of Efftel, Cass is poised to rapidly scale its TEM business in the European markets, especially for deals with a business process outsourcing or facilities management component.
Andy Vidan is the CEO of Cambridge, Massachusetts-based DataOps startup Composable Analytics. He founded the company two years ago with MIT colleague Lars Fiedler. They now lead Composable—self-funded and self-sustaining, by the way—and are establishing a beachhead in the nascent DataOps space. I recently spoke with him about the genesis of his company, what it’s like to (maybe) work with the U.S. DoD, and the challenge of evangelizing DataOps to line-of-business stakeholders.
TOPH WHITMORE: Tell me about Composable Analytics.
ANDY VIDAN: Composable Analytics grew out of a project at MIT’s Lincoln Laboratory. Lincoln Lab is an MIT R&D center that provides advanced technology solutions to the U.S. Department of Defense and intelligence community. There, we saw the clear need for a unifying platform that can ingest all types of data and feed it to an intelligence analyst. An intelligence analyst within the Department of Defense is similar to a business analyst within the private sector. They’re sophisticated. They know their subject matter well, better than software developers may ever know their business. But they’re not always technical, and when they have to deal with different data sets from different systems, with different formats and different structures, they must rely on software engineers and use a variety of disjoint tools that further complicate their workflows.
Our approach was different: We wanted to develop a single ecosystem to bring in data from all sorts of sources, and present it to the user for self-service data discovery and analytics. For us, Big Data always meant all data. Aside from the massive amounts of data—which the community already knows how to handle—or even the high Big Data velocity and throughput, we focused on the variability that comes with all data: There’s always tabular data, and tabular data, and more tabular data, but we also have to think about image files, text documents, PDFs, sound files, and so on. We also wanted to make data accessible to an end user who knows the subject matter but is not a technical person.
TW: You and Lars Fiedler developed Composable while working at Lincoln Lab. How did Composable evolve from an MIT idea into a commercial solution?
AV: Lincoln Laboratory is a well-kept secret.
TW: With the defense department involved, it probably has to be!
AV: Yes. MIT Lincoln Laboratory is really one of the premier research labs in the US, very much like the old Bell Labs, or the Jet Propulsion Lab that NASA runs with Cal Tech. Composable Analytics was initially funded directly by the DoD. The nice thing about Lincoln Lab is that you have that user interaction. You aren’t just writing research papers, you are prototyping, building systems, you are meeting with end users—in this case, intelligence analysts and operators—to be able to really get down to requirements and get a system that they would eventually use.
TW: Does Composable Analytics still serve the Department of Defense?
AV: Yeah. So I can’t really answer the question.
TW: Good enough!
AV: Our main focus is private sector.
TW: Tell me more about the Composable Analytics technology. What value propositions do you offer to an enterprise IT leader?
AV: Three things: orchestration, automation, and analytics. To me, that really embodies what’s behind DataOps. Our platform, our ecosystem provides those three things for an enterprise and for users of data within that enterprise.
Let me walk you through a real use case: One of our financial sector customers wants to build effective customer profiles. One touch point is their call center. You might call in to request a change of address after a recent real-estate purchase. This is normally a short call: the call center agent would change the address and hang up the phone and everybody’s happy. But this is a situation where an organization can learn more about the customer. An enterprise can use that little tidbit of information that you just revealed about yourself in order to understand what other products and services you might be interested in. The fact that you purchased a home might mean you’re willing to purchase life insurance. You might mention you are having a baby. That might incite you to open an educational savings account with the company. What does this require? Being able to integrate with a Voice-over-IP system and orchestrate a data flow that takes the call-center recording, in real time, pushes it into a speech-to-text engine, takes the resulting unstructured text and uses various analytics and natural language processing techniques in order to determine intent, sentiment, and trigger words that can then be directly inserted back into a CRM. The call center agent can see that on your profile and talk to you about it during that call, or next time you call. That embodies orchestration, automation, plus analytics. Those are the types of complex all-data flow use cases we’re addressing.
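As a rough illustration only, the orchestration Vidan describes might be sketched like this. Every function here is a hypothetical stand-in: a real flow would call an actual speech-to-text engine, NLP service, and CRM API, none of which are shown, and this is not Composable's actual interface:

```python
# Illustrative sketch of the call-center data flow described above.
# All names are hypothetical stand-ins, not Composable's API.

TRIGGER_WORDS = {"home", "baby", "retirement"}  # hypothetical example triggers


def transcribe(recording: bytes) -> str:
    # Stand-in for a real speech-to-text engine.
    return recording.decode("utf-8")


def extract_signals(transcript: str) -> dict:
    # Stand-in for NLP (intent, sentiment); here just naive trigger-word spotting.
    words = set(transcript.lower().split())
    return {"triggers": sorted(TRIGGER_WORDS & words)}


def enrich_profile(profile: dict, recording: bytes) -> dict:
    # Orchestration: each stage's output feeds the next; the result
    # lands back on the customer's CRM profile.
    signals = extract_signals(transcribe(recording))
    return {**profile, **signals}


profile = enrich_profile({"id": "C123"}, b"we just bought a home and expect a baby")
print(profile["triggers"])  # ['baby', 'home']
```

The point is the shape of the pipeline (ingest, analyze, write back) rather than any individual stage, which in production would each be a substantial integration.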
TW: It sounds like a platform play. Are you essentially offering and delivering and serving pretty much the whole data value chain from ingestion through consumption?
AV: Yes, we are, and that’s where DataOps comes into play. There’s always raw data out there. At the end of the day your business users are getting value from applications, Excel or Dynamics or Power BI or Salesforce or NetSuite, whatever it is. But there’s a whole process that happens in between the raw data getting to the high-level application, a process that encompasses orchestration, automation, and analytics. That’s our play. That’s where we live. That’s what we do well.
TW: I like to talk about the enterprise conflict between IT leadership and line-of-business stakeholders like my former marketer self. Toph-the-marketing-boy wants self-service everything—data immediacy without data-administration complexity. On the other side, IT leadership is tasked with ensuring auditability, lineage, governance, security. Which side of that customer equation do you target? IT side? Business influencer? Or both?
AV: Almost always the business side.
TW: Interesting. I confess that’s not what I expected!
AV: We typically find that the business side is willing to adopt new technologies so it can directly increase business value. Back to DataOps, we enable the business side to develop operational data science solutions, through reliable and robust continuous integration, while establishing, through the use of our tools, DataOps best practices. So, when the business side is ready to have IT leadership take ownership of its proven data implementations, we already have a layer of governance, security, and auditing around it, which makes the transition that much easier.
We talk about operationalizing data. In many cases, organizations have invested in PhD-level scientists to develop, implement, and validate data models. They do this by building what is normally a one-off analytic. It works beautifully, but at that point, the model has not provided any business value to the organization.
That one-off data model or data analytic must fit into a larger data workflow, one that the organization supports, and which works in conjunction with IT. It must integrate with production databases, query data, pull it into the analytic model, perform the computation, and push it back into other production databases, production CRMs, maybe into ERP systems. It’s that part—the data-workflow management—that is missing in today’s Big Data solutions. That’s where the Composable platform comes in. It allows you to connect the data sets, plug-and-play the analytics—that you either write or bring in from other open-source libraries—and be part of this broader operational process.
TW: You’re preaching to the converted! Enterprises need to hear the DataOps gospel. But I think most face a challenge on both the data consumption and data management sides of the house: They must overcome conflicting objectives to collaborate. Do you find that it’s difficult to evangelize collaboration to these enterprise groups?
AV: No. It’s actually easy once we’re in. When enterprises use our platform as a framework for building these operational data flows, we typically have good engagement with IT leaders because they see things are developed correctly.
TW: What’s deployment like?
AV: The platform is a distributed web application developed as a native cloud application. It can be deployed on the cloud, and scales well both horizontally and vertically. You can spin up an instance of Composable on AWS or Microsoft Azure, but the public cloud is not required. We can deploy Composable for an enterprise on-premises. Back to our Department of Defense legacy, one of our requirements was to be able to run not just on-premises, but on air-gapped networks, and we can do that. With some of our customers—within insurance and finance—the data is sensitive, and we run on a cluster behind the corporate firewall completely disconnected from the web.
TW: What’s Composable’s funding situation?
AV: We were lucky enough to leave MIT with a product and customers ready and waiting. From day one—the end of 2014—we’ve been completely client-funded.
TW: Will you look to subsidize growth with outside investment?
AV: Yes. I think 2017 is the year for us. We’re reaching a point where capital will help us scale out dramatically.
We’re a growing but small company, with the entire team being technical and focused on product development. As we grow, our focus will be to bring on forward-deployed engineers and customer success managers to help with deployment. This will help us approach a broader set of customers and work with them to develop a DataOps strategy, based on a small-scale, short-term pilot that may last one or two months at most. After that, and after they see the value, they buy into Composable as a licensed delivery platform.
TW: Where is your customer base?
AV: All regions, but predominantly domestic. We have, for example, one large customer that is a global energy conglomerate with operations in South America and other parts of the world.
TW: I understand you’re producing an upcoming conference?
AV: Yes—the DataOps Summit conference series. The next event is in June here in our hometown of Boston. We’re focused on getting all the data professionals into the same room. That’s both the business side of the house and technical audiences, like software developers, data scientists, data engineers, IT operations, quality assurance engineers, and so on. More details online at dataopssummit.com.
Many enterprises have invested in data science, and developed some cool data applications, and now must figure out how to put them in an operational workflow to actually generate value! That’s what we’re trying to illustrate with this DataOps Summit series. We’ll bring in executives from the business side—financial services, insurance, oil and gas, cybersecurity, other verticals as well—and talk about what DataOps tools, techniques, best practices they can put together around data operations. But we’ll listen, too: The technology vendors in the room—Composable and others—can work with them on a DataOps vision that we can all build towards.
TW: Where does Composable Analytics go from here?
AV: First, democratizing data science. Enterprise business users should be able to work more and more like data scientists. Our current end users are typically sophisticated business users, but not necessarily technical. Ultimately, they know the business better than anyone else. We’re creating a framework to help these users develop their own analytical workflows. Composable has a visual designer that lets you create complex dataflows regardless of your technical level. That means a complex data pipeline can be created visually, just as you would draw out a workflow on a whiteboard! We have a machine-learning computational framework behind this that will accelerate the process for an analyst to build these workflows. As that analyst selects different modules to build up the data flow, the machine will recommend the next such module to come in. So, machine learning is accelerating the development of new machine-learning data flows. That’s pretty cool.
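The "recommend the next module" idea can be illustrated with one of the simplest possible techniques, a bigram frequency model trained on past data flows. The module names below are invented, and this is a sketch of the general concept, not a description of Composable's actual recommender:

```python
from collections import Counter, defaultdict

# Minimal "next module" recommender: count which module historically
# follows which across past data flows, then suggest the most frequent
# successor of the module the analyst just placed.


def train(flows):
    """Build successor counts from historical flows (lists of module names)."""
    successors = defaultdict(Counter)
    for flow in flows:
        for current, following in zip(flow, flow[1:]):
            successors[current][following] += 1
    return successors


def recommend(successors, current):
    """Return the most frequent successor of `current`, or None if unseen."""
    followers = successors.get(current)
    return followers.most_common(1)[0][0] if followers else None


# Hypothetical historical flows built in a visual designer.
history = [
    ["csv_reader", "cleanser", "model", "chart"],
    ["csv_reader", "cleanser", "join", "chart"],
    ["db_query", "cleanser", "model", "report"],
]
model = train(history)
print(recommend(model, "cleanser"))  # model ("model" follows "cleanser" in 2 of 3 flows)
```

A production system would presumably condition on more context than the single previous module, but the bigram model captures the basic mechanic of learning recommendations from accumulated workflows.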
Second, there’s a lot of noise out there, and we’ve seen many organizations delay data-management solution adoption. Composable started as a self-service analytics platform, but over time has become a DataOps platform with orchestration, automation, and analytics aimed at getting people out of the rat’s nest of spreadsheets, and to start thinking about modern data architectures. We see DataOps being this transformative notion of best practices that allow organizations to say “Okay, we can do this.” We know how to do software development. We know how to build production systems. Now, let’s bring that to the data world and start to think about production data platforms and operational data science.
Welcome to the first edition of This Week in DataOps! (And before you ask, no, it probably won’t come out every week.) For a reference point, think of “This Week in Baseball,” only the highlights are about data-derived value maximization. (Yes, that’s the hashtag: #dataderivedvaluemaximization. Lot of competition for that trademark, I bet.)
In this roundup: Two DataOps companies step into the light, two upcoming DataOps events take the stage, and a big DataOps buy signals a big DataOps player’s commitment to data governance transparency.
In news from BHR HQ city Beantown, two new startups have taken up the mantra of DataOps. Composable Analytics, based across the Charles in Cambridge, grew out of a project at MIT’s Lincoln Laboratory. Cofounders Andy Vidan and Lars Fiedler started Composable back in 2014 with the aim of delivering orchestration, automation, and analytics, all within a DataOps context. Check out Andy’s lucid manifesto “Moving Forward with DataOps.” (I’m a big fan of DataOps manifestos, by the way.) Key takeaway: Real-time data flows, analytics delivered as a service, and composability are essential to DataOps success.
Another Boston-area firm is making news in the DataOps space. (New Cambridge, Massachusetts tourism slogan: Come for the craft beer. Stay for the data workflow management.) DataKitchen is the self-described “DataOps Company,” and delivers an algorithmic platform based on data “kitchens,” where enterprise data consumers create data “recipes” spanning data access, transformation, modeling, and visualization. And cofounders Christopher Bergh and Gil Benghiat will be speaking on “Seven Steps to High-velocity Data Analytics with DataOps” at this month’s Strata + Hadoop World event in San Jose. (Apparently some of the steps are “shocking!” More details on that not-at-all-clickbaity preso here.)
Speaking of upcoming events, two feature a DataOps agenda. In June, head to…yep, Cambridge, Massachusetts for the DataOps Summit, a two-day show produced by the nice folks at Composable Analytics. Day one will focus on DataOps business use cases and day two examines DataOps technical innovations. Speakers include Tamr CEO Andy Palmer, MIT Lincoln Laboratory researcher Vijay Gadepally, Unravel Data CTO Bala Venkatrao, IBM UrbanCode Deploy product manager Laurel Dickson-Bull, and Ritesh Ramesh, chief technologist for PwC’s Global Data & Analytics practice. (Maybe don’t bring up the Oscars with Ritesh.)
And in late May, head to Phoenix for Data Platforms 2017. This year’s theme is “Engineering the Future with DataOps.” The show is sponsored by O’Reilly, Qubole, Amazon Web Services, and Oracle. Featured speakers include former Obama administration “Geek in Chief” R. David Edelman, Qubole CEO Ashish Thusoo, and Facebook engineering director Ravi Murthy.
That’s it for now. See you next week in DataOps!
Note: To support questions from enterprise buyers and private investors that are looking at Telecom Expense Management and the greater Communications Lifecycle Management world, Blue Hill is starting a monthly review of the key announcements made in this space from companies including, but not limited to: 2-markets, 4telecomhelp, ACCOUNTabill, Advantix, AMI Strategies, Anatole, Asentinel, Avotus, Calero, Cass Information Systems (NASDAQ: CASS), Cimpl (formerly Etelesolv), Comview, EZwim, GSGTelco, IBM Global Services (NYSE: IBM), ICOMM, MDSL, mindWireless, MOBI, Mobichord, Mobile Solutions Services, MobilSense, MTS (NASDAQ: MTSL), Nebula, NetPlus, Network Control, One Source Communications, Softeligent, Tangoe (NASDAQ: TNGO), Telesoft, TNX, Valicom, vCom, Visage, and Vodafone Global Enterprise (NASDAQ: VOD).
Communications Lifecycle Management news items that have gotten Blue Hill’s attention in February 2017 include announcements from Calero, Cass, MTS, and vCom.
On February 13, Calero announced that its portfolio of VeraSMART wireline and wireless call accounting software will be sold through the Avaya DevConnect Select Product Program. VeraSMART allows users to combine, simplify, and share landline, cellular, and telecom data through dashboards and reports, and analyze voice, mobile, and unified communications (UC) data through Calero’s Insight Analytics solution. Avaya is a global provider of business communications software, systems, and services, including unified communications products.
Through the DevConnect Select Product Program, Avaya customers can integrate Calero’s call accounting, analytics, and reporting solutions with Avaya’s communications software and services to seamlessly combine and allocate wireless, wireline, and UC data, and generate reports and analysis on this data.
Blue Hill notes that the integration between Avaya and Calero is consistent with the trend we have observed for TEM vendors to increasingly integrate additional forms of enterprise communication, such as UC, into their management portfolios. Calero’s analytics and reporting capabilities will provide value to Avaya customers beyond simply expense management, and Avaya will provide Calero with new sales opportunities to both Avaya customers and through Avaya’s partner channel.
Cass Information Systems
On February 2, Cass Information Systems reported its fourth quarter and full year 2016 earnings, making 2016 the highest-earning year in the company’s history. Strong fourth quarter performance contributed to the result, with Cass reporting fourth quarter earnings per share of $0.57, up 8% from the same period in 2015. Net income for fiscal year 2016 was $24.3 million, up nearly 6% from 2015.
Much of this growth was attributed to Cass’ Facility Expense business, which includes its Energy, Telecom, and Waste divisions. Facility Expense dollar volume increased 1.6% YOY for the period, to $11.9 billion. In contrast, Cass’ Transportation dollar volume dropped 7.2% YOY, to $22.8 billion.
For fourth quarter 2016 alone, Cass’ Facility Expense dollar volume increased 7.2% from the same quarter in 2015. The company attributed fourth quarter growth in its Facility Expense business to new customer wins, including several large accounts that migrated to Cass from competitors.
While Cass does not directly break out its Facility Expense revenues by division, Blue Hill notes that the growth in its Facility Expense vertical suggests that Cass’ Telecom business is experiencing strong performance and continuing growth. Cass’ record-breaking earnings are impressive, especially coming at a time when smaller, niche players continue to enter the TEM market, and market leadership positions are shifting. Considering the changing market dynamics in TEM, Blue Hill notes that Cass’ new Facility Expense account wins from competitors may have been in Telecom.
On February 6, MTS announced that its CEO, Orey Gilliam, has decided to leave the company effective April 30, 2017. Alon Mualem, MTS’ current CFO, will take over for Gilliam on an interim basis, effective immediately, while MTS searches for a replacement CEO. Gilliam served with the company for roughly 8 months, joining MTS on June 1, 2016. Prior to joining MTS, Gilliam served as CEO of ICQ, later bought by AOL, and became head of AOL’s Messaging business including both ICQ and AIM product lines.
Blue Hill previously covered the appointment, speculating that Gilliam’s background in mobile- and internet-based startups indicated a particular interest in MTS’ video advertising business, which has been a core part of the company’s earnings since MTS acquired Vexigo Ltd. in April 2015. As of September 2016, video advertising accounted for nearly 50% of MTS’ revenue. But as the space becomes more competitive – particularly with entrants such as Snapchat, whose core business is video media – MTS may deprioritize this vertical going forward, and Blue Hill notes that conflicting priorities may have contributed to Gilliam’s departure.
On February 1, vCom announced that it has achieved a customer satisfaction rating of 99%, breaking its previous records. vCom also reported that its customers have rated vCom as “meeting or exceeding their needs” for the tenth year in a row.
As more TEM vendors enter the mid-market, Blue Hill notes that a focus on providing strong customer satisfaction and support can serve as a key differentiator for vendors and can lead to additional mid-market deals.
Last December, Blue Hill published our Mid-Market TEM Vendor Landscape for 2016, in which we profiled vCom as a notable TEM player actively pursuing deals in the mid-market. As part of the profile, Blue Hill spoke with an active client of vCom’s who validated vCom’s long-term customer relationships as a key differentiator for the company. In particular, the client noted a “domino effect” in which vCom’s fair treatment of its employees translated to strong customer relationships.
On February 22, Axway, an enterprise data integration and engagement software and consulting services provider, acquired Syncplicity, an enterprise file sync and share solution provider, in an all-cash transaction. Terms of the deal were not disclosed. The combined company will create a single platform that enables applications to reference and transfer digital files stored within cloud-based or on-premises content management systems, supporting the seamless exchange and synchronization of digital files.
Syncplicity will serve as a complement to Axway’s existing AMPLIFY platform for data integration, enterprise collaboration, and API lifecycle management. With AMPLIFY, clients can quickly develop and track applications across digital ecosystems, combining real time analytics and key metrics such as user engagement with tools to create API-based services. Axway currently supports 11,000 enterprise clients across 100 countries.
Syncplicity provides storage infrastructure, file sharing, document protection and backup, data migration, and collaboration tools for mobile and desktop devices. The platform is available on-premises, in the cloud, or in hybrid environments. Syncplicity competes with Dropbox and Box, but offers a more secure solution by supporting a broader range of storage and rights management options. Syncplicity currently has over 25,000 customers, including both enterprise and individual accounts. Prior to the acquisition, Syncplicity was owned by the global investment firm Skyview Capital.
Enterprises are increasingly relying on cloud-based tools for a wide variety of business processes including collaboration, storage, computing, and analytics. As a result, remarked Jean-Marc Lazzari, CEO at Axway, “it’s imperative that file exchanges and synchronization between individuals is a secure and seamless experience.” Combined, Axway and Syncplicity aim to achieve this goal by creating a cloud-based platform capable of supporting high performance networking, file storage, and file transfer, as well as additional enterprise capabilities such as real time analysis, data integration, and user engagement tracking.
Based on Axway’s API framework, customers will be able to link applications and workflows with digital content stored in Syncplicity’s platform. For example, with the combined capabilities, a company can automatically share product documents with prospective buyers in its application, through APIs that locate reference documents stored with Syncplicity.
In April 2014, Blue Hill assessed Syncplicity, at the time owned by EMC, remarking that “although Syncplicity also started as a secure mobile-cloud content solution, it is increasingly being used as the mobile interface to directly connect to other [solutions]. This is an interesting direction that other companies in this space should consider.”
With Axway’s acquisition of Syncplicity, a mobile interface is created that directly connects Axway’s application management platform and API services with Syncplicity’s content storage solution, creating opportunities for additional content-driven workflows.
Syncplicity’s sync and share solutions allow enterprises to drive business value from data stored within applications, databases, or files, whether on-premises or in the cloud. The additional capabilities from Axway will allow clients to link applications with referenceable digital files, and to create additional workflows driven by APIs and built on content stored in the Syncplicity platform. Blue Hill notes that support for collaboration, data integration, API lifecycle management, and real-time analytics and user engagement metrics positions Axway and Syncplicity as a single platform for file storage and in-application file transfer.
Blue Hill noted the importance of partners at IBM Connect. Among the wide variety of partners and related vendors showcased, two sets of technology vendors stood out as most relevant to our subscription base: highly prominent partners and interesting technology partners.
Cisco, Box, and Oblong were the most visible partners, each getting significant stage time at IBM Connect. Cisco Spark, the company’s cloud-based unified communications solution, featured prominently, as did Cisco’s telepresence and WebEx solutions, all integrated with IBM Connections and IBM Workspace. This partnership is important: IBM has not deeply pursued voice solutions, so the combination of IBM and Cisco collaboration solutions should be accretive in nature.
Oblong is a cutting-edge user interface company that created the UX concepts for Minority Report. At IBM Connect, Blue Hill noted how IBM and Oblong integrated to create environments where physical drawings and objects could quickly be integrated into the Workspace or Connections environment to cut down friction in remote collaboration.
But the star, from Blue Hill’s perspective, was the demonstration of Box Relay, a joint IBM/Box offering that provides Box-based workflows, initially announced in September 2016. Blue Hill has noted the success of multiple software companies that have created custom workflows to solve core business issues while using Box as a cloud repository for documents. With Box Relay, IBM now has the ability to quickly create workflow-based applications across a wide variety of business use cases. As this occurs, Blue Hill expects that DocuSign will be brought in from an authentication and authorization perspective. This product should be both a strong standalone offering and the basis for Global Business Services offerings that can scale to enterprise needs while being developed rapidly.
Blue Hill also noted seeing a number of other key partners including, but not limited to:
• Actiance – a social media compliance and archiving solution able to control a wide variety of communications channels through key governance and compliance policies.
• Genus Technologies – a video and digital asset management platform providing enterprises with the ability to support both traditional content and video on a horizontal basis.
• OnTime – a group calendaring solution that Blue Hill believes will greatly simplify the management of resource availability.
• Sennheiser – which demonstrated headsets with microphones designed to greatly reduce noise pollution in open office environments. This area will be increasingly important over the next year as speech-based app inputs become increasingly important.
• TrustSphere – an email, unified communications, and messaging relationship analytics solution previously featured by Blue Hill Research.
• Vidyo – a video conferencing solution and Platform as a Service that can be used to embed high-quality video conferencing into apps. At IBM Connect, Vidyo showed an IoT-based demo that combined a scale, blood pressure measurements, and a remote doctor to support a future-facing view of remote medical treatment and checkups.
Blue Hill believes that this set of vendors will continue to help IBM in providing a future-facing view of collaboration designed to enhance human productivity. We recommend that Blue Hill’s community track each of these solutions over the rest of the year.
At IBM Connect, Blue Hill had an interesting discussion with David Brooks, CTO of IBM Watson Work, who provided insights on how IBM is differentiating itself. As a CTO who has brought large-scale applications to market, Brooks is well positioned to compete with the likes of Facebook and Google, which are also known for their massive scale. In hearing how IBM is reacting to this new paradigm of computing, Blue Hill learned how IBM is embracing open standards to accelerate business.
More interestingly, Brooks spoke about how his team was able to move from code to production in a matter of hours. This Fail Fast, Move Fast mentality is an interesting change for IBM compared to the IBM that many of us encountered earlier in our tech careers. This combination of an open, API-based, rapid-code, and design-based culture is a necessary step forward both for IBM and for other large organizations seeking to effectively compete with startups. The fact that IBM has taken this step forward in its Watson Work organization is an encouraging sign that it is developing true next-generation solutions.
Blue Hill also saw interesting innovations coming from the IBM Research Innovation Lab, which provided three especially interesting examples: a Network Data API (Visualization: Network Data API and Q&A Confusion Explore), Video Scene Detection (Video Scene Detection: Enriching Video Content), and a taxonomic demonstration of Skills within Your Enterprise (What Skills are within Your Enterprise?).
The Network Data API provides a networked visualization of questions related to each other. The example demonstrated showed questions asked through the Watson Talent solution related to onboarding and benefits. By showing which questions were related to each other, the Network Data API provided guidance on when questions might be semantically similar or related. Blue Hill believes that networked, or nodal, analytics is highly underused in the enterprise relative to its potential value, and that analytics professionals seeking to understand the next key trends in analytic relationships should closely track this demonstration and similar networked visualizations.
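The underlying idea can be illustrated with a minimal sketch (our own toy example, not IBM’s implementation): treat each question as a node and link questions that share keywords, producing the kind of adjacency a networked visualization would render as nodes and edges.

```python
def build_question_network(questions, min_shared=2):
    """Link questions that share at least `min_shared` keywords.

    Returns a list of (i, j) edges over question indices -- the kind
    of adjacency a networked visualization would render as links.
    """
    stopwords = {"how", "do", "i", "in", "my", "the", "a", "is", "what", "are", "to"}
    # Reduce each question to a crude keyword set
    keyword_sets = [
        {w for w in q.lower().replace("?", "").split() if w not in stopwords}
        for q in questions
    ]
    edges = []
    for i in range(len(questions)):
        for j in range(i + 1, len(questions)):
            if len(keyword_sets[i] & keyword_sets[j]) >= min_shared:
                edges.append((i, j))
    return edges

questions = [
    "How do I enroll in health benefits?",
    "What are my health benefits options?",
    "How do I set up my laptop?",
]
print(build_question_network(questions))  # → [(0, 1)]: the two benefits questions link up
```

A production system would use semantic embeddings rather than keyword overlap, but the output shape – a graph over questions – is the same thing the visualization draws.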
The Video Scene Detection solution provides an automated method for separating multi-scene videos based on color contrast. By itself, the solution offers a useful and direct method of separating scenes. But the real value will come from combining this capability with existing IBM Watson video and analytic capabilities, including facial recognition, automated transcription, and sentiment analysis, to provide greater automated guidance into video content. With this tool, IBM comes one step closer to independently categorizing and understanding video without direct human intervention.
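To make the color-contrast approach concrete, here is a minimal sketch (our own illustration, not IBM’s implementation): compare the average color of consecutive frames and flag a scene boundary when the difference exceeds a threshold.

```python
def scene_boundaries(frames, threshold=60.0):
    """Return indices where a new scene likely begins.

    frames: list of (r, g, b) average-color tuples, one per frame.
    A boundary is flagged when the color distance between
    consecutive frames exceeds the threshold.
    """
    boundaries = []
    for i in range(1, len(frames)):
        r1, g1, b1 = frames[i - 1]
        r2, g2, b2 = frames[i]
        # Euclidean distance in RGB space as a crude contrast measure
        dist = ((r1 - r2) ** 2 + (g1 - g2) ** 2 + (b1 - b2) ** 2) ** 0.5
        if dist > threshold:
            boundaries.append(i)
    return boundaries

# Two dark frames, a cut to a bright scene, then a cut to a red scene
frames = [(10, 10, 10), (12, 11, 9), (200, 200, 200), (198, 202, 199), (180, 20, 20)]
print(scene_boundaries(frames))  # → [2, 4]
```

Real implementations typically compare full color histograms per frame rather than a single average, but the cut-on-contrast logic is the same.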
But the most important research project that Blue Hill saw was the skills taxonomy created by IBM Research. By exploring the corpus of IBM Research documents, IBM was able to categorize the types of skills within its organization through a bottom-up approach. This scientific method of understanding skills within the organization is going to have repercussions across recruiting, learning and development, resource management, and project management as companies gain the ability to use this bottom-up approach and stop guessing whether a job or resource requisition is correct. IBM has created a research-based solution to this problem built on existing documentation. Blue Hill is eagerly anticipating the progression of this research project into IBM Kenexa/Watson Talent in the near future.
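The bottom-up idea – inferring a skills inventory from documents people have actually written, rather than from job descriptions – can be sketched in miniature (a toy illustration under our own assumptions, not IBM Research’s method):

```python
from collections import Counter

def mine_skills(documents, skill_lexicon):
    """Bottom-up skill inventory: count lexicon terms across a corpus.

    skill_lexicon maps a skill name to the words that signal it.
    Returns a Counter of skill -> number of documents mentioning it.
    """
    inventory = Counter()
    for doc in documents:
        words = set(doc.lower().split())
        for skill, signals in skill_lexicon.items():
            # A document evidences a skill if any signal word appears
            if words & set(signals):
                inventory[skill] += 1
    return inventory

docs = [
    "deployed a spark pipeline for machine learning models",
    "built dashboards and visualization for the sales team",
    "trained deep learning models on gpu clusters",
]
lexicon = {
    "machine-learning": ["learning", "models"],
    "analytics": ["dashboards", "visualization"],
}
print(mine_skills(docs, lexicon))  # machine-learning: 2, analytics: 1
```

The resulting counts are the raw material a taxonomy is built from; the research version would cluster terms into skill categories automatically instead of relying on a hand-built lexicon.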
This set of conversations and demonstrations gave Blue Hill significant guidance on the direction of IBM Watson Work innovation and progress. Based on this event, Blue Hill anticipates product announcements in the near future built on increased agility, taxonomic analysis, nodal relationship analytics, and video analytics.