VMware's Industry Analyst Day Highlights Drive Toward "Consumer Simple, Enterprise Secure"


Blue Hill attended VMware’s EUC (End User Computing) Industry Analyst Day in Boston on Thursday, June 15th. After a morning of presentations, we met in small groups with key VMware EUC executives including Noah Wasmer, SVP Mobile Products; Jason Roszak, Director Product Management; Shawn Bass, VP & CTO for EUC; Courtney Burry, Sr. Director Product Marketing (Desktop); Dave Grant, VP of EUC Product Marketing; and Sumit Dhawan, SVP & GM of EUC.

VMware’s mission of “consumer simple, enterprise secure” was apparent. Despite the relatively complex technical platform enhancements VMware has made (broadened third-party integrations; the addition of artificial intelligence, machine learning, and natural language processing; and a focus on analytics and security), the overall message was one of simplicity and centralization.

VMware aims to transition the majority of its customer base to Workspace ONE, a single multi-use-case platform for business, workforce, and desktop mobility, by the end of 2017. VMware plans to wrap up its Go-to-Market (GTM) integration in the next three to six months. Can EUC drive enough Workspace ONE adoption over the rest of the year to bring the majority of its customers onto the platform? It remains to be seen, as Blue Hill notes that enterprises are at varying stages of device and application management.

Blue Hill has seen that many enterprises are still fairly new to enterprise mobility (or are outright laggards) and continue to use early-stage solutions such as simple Mobile Device Management (MDM) or Mobile Application Management (MAM) platforms, making the transition to Unified Endpoint Management (UEM) seem some ways off. It will be interesting to watch VMware over the next six months to see how this vision plays out.

Companies are beginning to recognize that the security and standardization benefits they achieve for their mobile fleets through an enterprise mobility platform should also be applied to other corporate devices such as laptops and IoT equipment. Right now, with MDM or MAM solutions in place, enterprise mobile devices are in many cases more secure, standardized, and controlled than laptops or desktops. VMware’s overarching vision is a move to UEM in which all enterprise IT equipment is managed under its Workspace ONE platform, including support for next-generation IT such as wearables, sensors, and IoT equipment. This vision certainly makes sense for where IT and enterprise technology are headed, but how soon VMware will be able to transition its customers away from more familiar MDM and MAM applications remains to be seen.

From a competitive standpoint, VMware faces growing threats from players like Microsoft that offer device and application management built into existing product lines, and thus at a lower price point. VMware seeks to differentiate through the strength of its technology to justify its platform’s sometimes higher cost. Continued enhancements to the platform include support for natural language processing and artificial intelligence to automate much of IT support and issue resolution. Additionally, VMware is expanding third-party integrations, such as with Salesforce, to offer more seamless in-app workflows, and broadening its cloud support from Azure to AWS and multi-cloud environments. With its acquisition of Apteligent, VMware is placing a much stronger emphasis on app analytics and app performance metrics to drive value.

Finally, relative to my own focus on Telecom Expense Management (TEM), it is worth asking where TEM is headed given the moves players like VMware are making into UEM. TEM vendors that are not actively developing strategies around next-generation IT, application management, IoT, and centralized, software-first offerings are already far behind the curve and offer limited value to the modern enterprise. The days of call accounting and bill pay as a significant source of IT value are coming to an end as enterprise technology investments scale and the need for a centralized, standardized, and managed device fleet (one that now includes remote and more complex devices) becomes apparent.

Voice Continues to Dominate: Apple Releases HomePod, Broadens Siri's Reach


Siri has a new shape. At its Worldwide Developers Conference (WWDC) on Monday, Apple announced several new products and software updates, along with a continued focus on artificial intelligence (AI) and machine learning. One of the new products is HomePod, Apple’s take on the home hub/virtual assistant market currently led by the Amazon Echo and Google Home. Device manufacturers are racing to become the dominant platform not only on your phone, but also in your home, in your car, and on your body. It was only a matter of time before Apple made more concrete moves into home hardware to address the broader consumer ecosystem that exists for virtual assistants and voice technology.

In October, I covered the release of the Google Home and Google Pixel, the company’s flagship smartphone, describing the announcement as the beginning of an AI “arms race” in which device manufacturers aim to make voice a dominant means of interacting with technology. I recently wrote further on this topic, describing some of the tasks AI is currently optimized to perform, as well as where I see it headed (hint: voice is the new text).

Apple already had a presence in the home through HomeKit, which allows users to control smart appliances such as thermostats, lightbulbs, and security systems through their iPhone as a central hub. HomePod will now serve as a hub for HomeKit connected products, similar to how the Amazon Echo, for example, can be used to turn lights on and off, or adjust the heat. Other than serving as a smart home hub, HomePod can be used to play music (exclusively Apple Music, which may be a deal breaker to some), perform basic tasks such as setting a timer or providing weather, traffic, or news information, and answer questions through calculations or an Internet search.

For anyone who has been following the smart home/virtual assistant markets, Apple is a fairly late entrant with HomePod. But with the company’s range of products, from smartphones to smartwatches to HomeKit, and its investments in both artificial intelligence and machine learning, creating a home hub product makes logical sense, especially if that product makes use of Apple’s existing Siri voice interface. Siri has been a part of the iPhone since 2011, and Apple recently added Siri to its Mac line of computers as well.

Apple is positioning HomePod primarily as a speaker for playing music, differentiating it from virtual assistants like the Amazon Echo, which offer weaker audio hardware, and from conventional speakers, which allow neither voice interaction nor a wider range of tasks. Music aside, the more likely reason someone would purchase the HomePod over the Echo or the Google Home is entrenchment in the Apple ecosystem. This brand loyalty is creating very distinct offerings in the smart home and virtual assistant markets with limited compatibility with one another. For example, Apple wants you to use its music service, rather than Amazon’s or Google’s, with the HomePod. Once you buy into one brand, you’re pretty much stuck.

It’s no surprise that Apple made moves into the connected home and broadened Siri’s device reach. (It’s actually more of a surprise that Apple was so late in doing so.) What will be interesting to watch is how both third-party and native integrations are built out across all these platforms. Apple has an opportunity to create seamless, intuitive experiences that transfer between devices (from phone, to watch, to speaker, to computer) and enable more meaningful technology interactions. That is one of the main sources of value in voice as a means of interacting with technology: it can potentially provide an easier, more intuitive way to use a device. If Apple can open Siri up to additional third-party apps through APIs, the value of the HomePod greatly increases. The strength of Apple’s device ecosystem gives the company a clear opportunity to focus on providing a cohesive, unified experience in which users can switch between devices seamlessly and maintain a continued interaction.

Right now, the tasks that Siri is equipped to perform are so narrow and unambiguous that the time savings from using voice versus manually interacting with the device are slim. Interestingly enough, Amazon just released the Echo Show, a digital assistant device that also has a touch screen. In my opinion, the inclusion of the touchscreen demonstrates how far voice interaction needs to come before it will be the dominant means of using a device, and really highlights the shortcomings that exist with current iterations of the technology. For most tasks, it’s just easier to use a screen.

In covering this market, my view is that Apple’s announcement is nothing revolutionary. But it does demonstrate the continued importance companies are placing on creating an ecosystem for their devices, and forging new touchpoints in a consumer’s life through availability in cars, homes, and on-person. I wrote back in December that voice will be the new text, and I still believe it. But it’s clear we are still some ways away from that. For now, I’m content to ask Siri to set a timer or play The Shins, while imagining the exciting new kinds of voice interactions we will be having in the near future.

IBM and Cisco Systems Team Up for Integrated Cybersecurity Solution, Services, and Threat Intelligence

The importance of interoperability and resource allocation to cybersecurity cannot be overstated. Protecting against today’s highly dynamic threat environment demands a concerted, collaborative effort, not a fractured or siloed approach to cybersecurity infrastructure. Security needs to be baked into the infrastructure as a function of business agility and operations. The proof can be seen in Cisco’s “2017 Security Capabilities Benchmark Study,” which found that 22 percent of organizations hit by a cyberattack lost customers, and nearly one-third lost revenue. A recent Cisco survey of chief information security officers (CISOs) found that 65 percent use up to 50 different security products that do not integrate, challenging their overextended security teams to move with speed. This fragmentation undermines analysts’ ability to proactively identify malicious activity and mount an orchestrated response to halt an attack and mitigate damage to the business.

So when IBM and Cisco announced on Wednesday of this week their agreement to combine efforts on integrated cybersecurity, a number of people sat up and took notice. The announcement included new IBM QRadar integrations with Cisco security offerings, including an agreement that Cisco will build two new applications for the IBM Security App Exchange for Cisco Firepower and Cisco Threat Grid.

More than just sharing information, the IBM-Cisco security partnership also includes product integrations to help organizations benefit from the joint capabilities of the two companies. Among the planned integrations are a pair of Cisco applications that will run on top of the IBM QRadar security platform. The applications will bring Cisco’s Threat Grid and AMP (Advanced Malware Protection) to IBM, enabling users to benefit from advanced analytics. IBM’s Resilient Incident Response Platform (IRP) will also be integrated with Cisco’s Threat Grid platform to pull in indicators of compromise (IOCs). The goal of this collaborative effort is to ensure that all the pieces of a highly integrated cybersecurity solution work together. Elements of this collaboration will feature security products designed for interoperability at all levels of the security stack, whether they come from IBM or Cisco.
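For a sense of what such an integration does mechanically, here is a minimal sketch of pulling IOCs from a threat intelligence feed and attaching them to an open incident. The endpoints, token, and field names are hypothetical placeholders, not the actual Threat Grid or Resilient APIs, which require vendor credentials and differ in detail.

```python
import requests

# Hypothetical endpoints and token, for illustration only; the real
# Threat Grid and Resilient APIs differ and require vendor credentials.
FEED_URL = "https://threat-feed.example.com/api/iocs"
IRP_URL = "https://irp.example.com/api/incidents/{id}/artifacts"
API_TOKEN = "replace-me"
HEADERS = {"Authorization": "Bearer " + API_TOKEN}

def pull_iocs(since):
    """Fetch indicators of compromise (IOCs) published after `since`."""
    resp = requests.get(FEED_URL, params={"after": since},
                        headers=HEADERS, timeout=30)
    resp.raise_for_status()
    return resp.json()["iocs"]

def attach_to_incident(incident_id, iocs):
    """Attach each IOC to an open incident as an artifact."""
    for ioc in iocs:
        resp = requests.post(IRP_URL.format(id=incident_id),
                             json={"type": ioc["type"], "value": ioc["value"]},
                             headers=HEADERS, timeout=30)
        resp.raise_for_status()

if __name__ == "__main__":
    attach_to_incident(42, pull_iocs(since="2017-06-01T00:00:00Z"))
```

The value of productizing this plumbing is that analysts stop writing one-off scripts like this for every pairwise tool combination; the platforms exchange indicators natively.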

There are three core elements of this new partnership. The first is an integrated threat defense across networks and the cloud. Cisco plans to build new applications delivered via the IBM Security App Exchange to help security teams detect and respond more effectively and quickly to threats.

The second core element is essential threat intelligence sharing between Cisco Talos and the IBM X-Force Exchange, with teams from each collaborating closely on security research.

The third core element is jointly delivered managed services. Specifically, the IBM Managed Security Services group will team up with Cisco to deliver new security infrastructure services aimed at reducing the IT complexity often associated with cybersecurity efforts. One of the first managed service offerings will target hybrid cloud environments, since customers are aggressively migrating security infrastructure to public and private cloud models.

We look forward to seeing the initial outcomes this collaboration agreement delivers for end users.

All About Automation


Note: This blog is the fifth in a monthly co-authored series written by Charlotte O’Donnelly, Research Analyst at Blue Hill Research, and Matt Louden, Brand Journalist at MOBI. MOBI is a mobility management platform that enables enterprises to centralize, comprehend, and control their device ecosystems.

Lately, doom-and-gloom predictions about robot workers and how we’ll all be out of a job have dominated tech news headlines. While these stories are usually educational and entertaining to read, they’re not new.

In fact, workers have feared automation for nearly 300 years. Modern professionals share similar concerns with Industrial Revolution-era workers who protested the implementation of sewing machines, factory production, and steam engines, for example.

While today’s Artificial Intelligence (AI) and machine learning technologies have the potential to radically alter the way we work, don’t start boxing up your office just yet. Throughout history, new innovations have always created more jobs than they’ve eliminated: they’ve changed how we work, not whether we work.

Powered by People

While employees may initially view intelligent technology with suspicion, AI’s ability to make a task cheaper and quicker to complete increases the demand for skilled labor on tasks that are impossible to automate. Consider our Industrial Revolution examples above: after new technologies were implemented, a cloth weaver’s output increased 50-fold thanks to a 98% reduction in the labor required per yard of cloth. Cheaper cloth drove up demand, and there were four times as many textile employees by 1900.
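The arithmetic behind that 50-fold figure (our gloss on the two numbers, not the study’s own derivation): if each yard of cloth requires only 2% of the labor it once did, a fixed amount of labor now yields 1 / (1 − 0.98) = 1 / 0.02 = 50 times as much cloth as before.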

More recently, consider the retail banking industry. While ATMs reduced the average number of employees per branch early on, they also streamlined bank expenses. These cost savings were so dramatic that the number of urban bank branches rose by 43% from 1983 to 2004, more than making up for those initial job losses.

The goal of implementing automation isn’t to eliminate jobs; it’s to redefine and streamline them. While some workers may be required to learn new skills, they are overall much more likely to find jobs than lose them. A 20-year analysis of the American workforce found that employment grew significantly faster in occupations that relied on computers to enable employees to perform non-automated tasks more effectively.

Most experts believe large-scale job loss isn’t what enterprises should be worried about. According to the Organization for Economic Cooperation and Development, only nine percent of US jobs are at high risk of being automated. What is cause for concern is that less than one tenth of one percent of annual GDP is dedicated to helping people manage workplace changes, and this funding has declined over the last 30 years.

A Few Things to Remember

While each automated technology implementation is unique, there are a few universal problems your enterprise will likely need to troubleshoot. Here are five things to keep in mind when adopting AI and machine learning:

1. You’re the Expert

Whether you’re building RoboCop, R2-D2, or Johnny 5, realize that even the most advanced automation system is just a tool. It’s your process: you’ve managed it for weeks, months, or years already. Make sure someone is constantly monitoring and managing the technology to ensure it’s satisfying your company’s needs.

2. Actions Speak Louder Than Words

Employees are going to be afraid when they hear about AI and machine learning. Simply teaching them what these terms mean isn’t enough; if you want to get everyone on board, organizational leaders need to show the workforce how this technology has been used before and why it will make completing specific tasks easier.

3. Fill Your Bandwagon From Top to Bottom

Even after explaining the advantages of automation, don’t expect complete employee buy-in right away. Workers need to see executives and IT leadership 100% behind a new technology before they’ll start to get comfortable with it.

4. Open Minds and Doors at the Same Time

One thing’s inevitable—there’s going to be pushback. People seldom enjoy change, so prepare for this ahead of time and let your workforce know that they need to trust automation (even if it goes against intuition or traditional ways of doing business).

5. Technology Can’t Do It Alone

Automated systems, AI, and machine learning speed up decision-making, but do not eliminate the need for human interaction. Technology is nothing more than a complement to your organization’s strategic thinking capabilities; humans must do the heavy lifting here.

As automation continues to grow in popularity, it’s important that companies embrace new technology for the right reasons. Your AI and machine learning capabilities will never actualize their potential if leveraged solely to replace employees.

Blue Hill Research Communications Lifecycle Management Highlights: May 2017


Note: To support questions from enterprise buyers and private investors that are looking at Telecom Expense Management and the greater Communications Lifecycle Management world, Blue Hill is starting a monthly review of the key announcements made in this space from companies including, but not limited to: 2-markets, 4telecomhelp, ACCOUNTabill, Advantix, AMI Strategies, Asentinel, Avotus, Calero, Cass Information Systems (NASDAQ: CASS), Cimpl (formerly Etelesolv), Comview, EZwim, GSGTelco, IBM Global Services (NYSE: IBM), ICOMM, MDSL, mindWireless, MOBI, Mobichord, Mobile Solutions Services, MobilSense, MTS (NASDAQ: MTSL), Nebula, NetPlus, Network Control, One Source Communications, Softeligent, Tangoe (NASDAQ: TNGO), Telesoft, TNX, Valicom, vCom, and Visage.

Communications Lifecycle Management news items that have gotten Blue Hill’s attention in May 2017 include announcements from Calero, MTS, and vCom.

Calero

Announcing the Release of Calero PINNACLE 7.0

On May 4, Calero announced the release of PINNACLE 7.0 – an update to the company’s on-premise and cloud-based Communications Lifecycle Management software that includes a new user interface and additional analytics capabilities.

The update focused heavily on creating an easier-to-use interface, with improved navigation, better use of screen real estate, and a responsive design that adjusts to various device sizes. Calero’s guided analytics update for PINNACLE 7.0 aims to enable users to work smarter and faster through self-service capabilities and increased visibility, rather than static reports and dashboards.

Blue Hill has been documenting the strategies of various TEM companies in our landscape reports and other analyst coverage, and we have noticed a distinct trend for vendors either to offer full self-service analytics or to provide reporting and analytics as a managed service in which the vendor generates reports for the client. Both approaches have their advantages and will appeal to different types of enterprises. Calero’s PINNACLE update addresses the self-service trend by offering a guided analytics approach that enables the platform to be used not only by IT but also by purchasing, procurement, mobility, and line-of-business managers across the organization.

MTS

MTS releases new version of eXsight for Unified Communications and Collaboration Management Solution

On May 3, MTS announced the release of a new version of its eXsight Unified Communications and Collaboration (UC&C) Management solution. The new release includes an updated user interface and expanded reporting, provisioning and control, and Business Intelligence (BI) capabilities. The platform is available either through a licensed or cloud hosted model.

The update seeks to provide an expanded view into business metrics (such as employee productivity, application adoption, and helpdesk insights) and the ability to integrate with Human Resources, Finance, and IT systems. The platform enables ad hoc reporting that allows users to customize reports and automatically share them across departments, with access available via mobile, browser, or desktop.

eXsight allows clients to gain information instantly through a single sign-on screen, and to manage their UC&C needs including instant messaging, employee presence, application sharing, and file transfer.

Blue Hill has observed that enterprises are seeking to manage Unified Communications technologies within their TEM platforms. MTS’ eXsight platform features expanded capabilities in BI, reporting, and self-service, to enable more seamless and functional integration with UC and communications technologies within a platform equipped for remote as well as traditional desktop access.

MTS Announces First Quarter 2017 Financial Results

On May 11, MTS announced its first quarter 2017 earnings report. Revenues for Q1 2017 were $2.4 million, down from $3.3 million for the same period in 2016. However, telecom revenues increased to $1.8 million in the first quarter of 2017, compared to $1.6 million during first quarter 2016. The company’s Vexigo video advertising business reported revenues of $575,000 in Q1 2017, down from $1.7 million during the same period in 2016. Overall, the company reported a GAAP net loss of $899,000.

Commenting on the financial results, Chairman of the Board of MTS, Haim Mer, noted that the company’s telecom business “continues to be stable” and that the company has “signed two new TEM customers to long-term contracts.”

As of September 2016, video advertising accounted for nearly 50% of MTS’ business. The decline in the company’s Vexigo revenues may indicate that it will more heavily prioritize its telecom expense management division going forward (as evidenced by recent updates to and investments in the company’s TEM platform).

In February, upon the departure of MTS’ CEO, Orey Gilliam, Blue Hill noted that the video advertising space is becoming much more competitive, and hypothesized that MTS would place a lower priority on this vertical due to both Gilliam’s departure and the difficulty of competing in the market. The growth in MTS’ telecom business despite an overall loss for the quarter demonstrates the continued opportunities that exist in the TEM market for both leading global players and mid-market vendors, especially those preparing for the next generation of TEM, for example by integrating with UC technologies.

vCom

vCom Enhances its Mobility Solutions with ServiceNow Integration

On May 3, vCom announced the integration of ServiceNow, a leading provider of cloud-based IT service support management solutions, with its enterprise mobility platform. The integration will enable vCom to offer its customers a more direct and efficient means of handling IT support requests, using ServiceNow to drive transparency, deliver faster service, and reduce costs. ServiceNow will integrate directly with vCom’s platform, enabling users to open support requests without IT involvement.
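For a sense of what “opening requests directly” can look like mechanically, here is a minimal sketch using ServiceNow’s standard REST Table API. The instance URL, credentials, and field values are illustrative placeholders; vCom’s actual integration details are not public.

```python
import requests

# Illustrative values only; a real deployment would use the customer's
# ServiceNow instance and credentials provisioned for the integration.
INSTANCE = "https://example.service-now.com"
AUTH = ("integration.user", "secret")

def open_request(short_description, description):
    """Create an incident via ServiceNow's Table API; return its sys_id."""
    resp = requests.post(
        INSTANCE + "/api/now/table/incident",
        auth=AUTH,
        headers={"Accept": "application/json"},
        json={
            "short_description": short_description,
            "description": description,
            "category": "mobility",  # hypothetical categorization
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["result"]["sys_id"]

if __name__ == "__main__":
    sys_id = open_request(
        "Replacement SIM needed",
        "User reports device cannot register on the carrier network.",
    )
    print("Opened request", sys_id)
```

Because the request is created programmatically from within the mobility platform, the user never has to leave it (or loop in IT) to get a ticket into the service queue.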

Blue Hill has increasingly been documenting the transformation of IT, and the need for TEM vendors to offer managed services that automate some or all of the processes and policies associated with enterprise mobility. Enterprises are more often seeking TEM solutions that integrate with a broader range of IT assets and services, and ServiceNow has become one of the more popular TEM platform integrations. vCom is strengthening its ability to compete in the changing TEM space by offering an integration with ServiceNow to continue supporting not only TEM, but the broader space the company dubs IT management-as-a-service (ITMaaS).

Blue Hill Research Communications Lifecycle Management Highlights: April 2017

Note: To support questions from enterprise buyers and private investors that are looking at Telecom Expense Management and the greater Communications Lifecycle Management world, Blue Hill is starting a monthly review of the key announcements made in this space from companies including, but not limited to: 2-markets, 4telecomhelp, ACCOUNTabill, Advantix, AMI Strategies, Asentinel, Avotus, Calero, Cass Information Systems (NASDAQ: CASS), Cimpl (formerly Etelesolv), Comview, EZwim, GSGTelco, IBM Global Services (NYSE: IBM), ICOMM, MDSL, mindWireless, MOBI, Mobichord, Mobile Solutions Services, MobilSense, MTS (NASDAQ: MTSL), Nebula, NetPlus, Network Control, One Source Communications, Softeligent, Tangoe (NASDAQ: TNGO), Telesoft, TNX, Valicom, vCom, and Visage.

Communications Lifecycle Management news items that have gotten Blue Hill’s attention in April 2017 include announcements from Cass, Telesoft, and Tangoe.

Cass

First Quarter 2017 Earnings Up 10% at Cass Information Systems, Inc.

On April 27, Cass Information Systems announced its first quarter 2017 earnings. The company reported a 10% increase in earnings per diluted share compared to first quarter 2016, and increases in revenue and net income of 5% and 8%, respectively. Net income for the period was $6.3 million.

Most notably, Cass saw a 29% increase in transactions for its Facility Expense business, which includes its electricity, gas, waste, and telecom expense management divisions. Cass attributed much of this increase to new customer wins from several accounts that migrated from competitors.

In February, Blue Hill covered Cass’ fourth quarter and full year 2016 earnings report, in which the company announced a 7.2% increase in fourth quarter 2016 Facility Expense dollar volume compared to the same period in 2015. Continued growth in its Facility Expense business suggests growth in Cass’ telecom expense management division as well. Blue Hill theorizes that many of the new account wins likely came from customers in telecom, especially those that migrated from market leaders in the space.

Telesoft

Telesoft Appoints Experienced Technology Leader Charles Layne Chief Executive Officer

On April 6, Telesoft announced the appointment of Charles Layne as Chief Executive Officer and member of the Board of Directors. Thierry Zerbib, Co-Founder and CTO of Telesoft, will continue to lead the product and technology side of the business.

Prior to joining Telesoft, Layne served as President and CEO of Signature Technology Group, a data center services provider, where he led the company to triple its global revenue and employee base. Layne has also held senior leadership positions at Microsoft and Insight Enterprises.

In January, Blue Hill wrote about additional executive changes at Telesoft including the appointments of Tamara Saunders and Don Luby as Chief Financial Officer and Senior Vice President of Sales, respectively, and Charlotte Yates to its Board of Directors.

The Telecom Expense Management market is clearly attracting attention from not only seasoned executives but also outside investors. Which leads me to my next point…

Tangoe

Marlin Equity Partners Enters into Agreement to Acquire Telecom Expense Management Leader Tangoe, Inc.

Perhaps the biggest change that has happened in TEM in recent years is Tangoe’s April 28th announcement that Marlin Equity Partners will acquire all outstanding shares of Tangoe for $6.50 per share in cash. Marlin Equity Partners plans to merge Tangoe with Asentinel, another market leader in TEM and one of Marlin’s portfolio companies, while maintaining the Tangoe brand name and CEO Jim Foy.

Will the combined company create a new global superpower? And is such a thing in TEM useful, or even possible? Blue Hill covers this and more, including the next generation of TEM – which we dub IT Enterprise Management (ITEM) – in our latest report.

This Week in DataOps: Rain, the Real World, and Another Manifesto (the Good Kind)

As the saying goes, April showers bring May flowers, unless you live in British Columbia, where April showers bring May showers, and let’s face it, the joke doesn’t work as well with June flowers and pilgrims.

It’s been a big week in the DataOps world. First off, if you missed it (or even if you didn’t and want to listen to it again—thanks, Mom), check out the recording of the joint webinar I did last week with Information Builders’ marketing VP Jake Freivald, “DataOps in the Real World.” We talked collaborative data orchestration (long hashtag), DataOps in healthcare, and fast-talkers. Some fun things you’ll learn:

  • Information Builders’ latest Omni-Gen release includes a unique, tiered-functionality offering of three different toolsets, including Integration, Data Quality, and MDM editions.
  • The Information Builders engagement with customer St. Luke’s University Health Network (a relationship I profiled here in an earlier DataOps research piece) was so successful that the two parties have collaborated to package the solution as a healthcare-vertical-targeted BI and analytics solution.
  • They can’t hear you if you knock your headset microphone away from your face.
  • No matter its relevance, “COMAECAL” is not a particularly marketable DataOps acronym. (Sing it with me, Collaborate! Orchestrate! Measure! Accelerate!…)

Qubole founders (and former Facebook infrastructure engineers, and Apache Hive co-developers) Ashish Thusoo and Joydeep Sen Sarma have just authored “Creating a Data-Driven Enterprise with DataOps.” The book—published by O’Reilly—evangelizes both DataOps corporate culture and platform. It also features case examples from the likes of eBay, Twitter, and Uber. Expect some promotion (!), presentations, and available copies at the upcoming Qubole-sponsored Data Platforms 2017 conference next month. (Check out my “Questioning Authority” DataOps interview with Qubole CEO Thusoo here.)

Also, in case you missed it, the big news last week was Infor’s acquisition of cloud BI and analytics developer Birst. The move is an interesting one, in part because it raises the profile of BI in an enterprise context: Infor offers ERP solutions, and now Birst BI tools will snap into that portfolio.

It’s still a work in progress, but if you’re committed to DataOps like the folks at DataKitchen, check out the draft DataOps Manifesto developed by a consortium of DataOps leaders. (I’m a big fan of DataOps manifestos.) It’s a call to action for the DataOps-faithful, and a series of (evolving) DataOps principles.

Finally, I’m looking forward to the upcoming Talend Connect and Informatica World events in California. Find me and let’s talk DataOps ‘til we’re blue in the face. (Just kidding. I’ll stop at flushed pink.)

Network and Device Security in the Age of IoT

Note: This blog is the fourth in a monthly co-authored series written by Charlotte O’Donnelly, Research Associate at Blue Hill Research, and Matt Louden, Brand Journalist at MOBI. MOBI is a mobility management platform that enables enterprises to centralize, comprehend, and control their device ecosystems.

IoT is transforming operations and ushering in a new age of security concerns and protocols for businesses. It converts each business access point into a potential new data source, generating feedback that changes on a per-second basis. The volume and granularity of this data make it a highly valuable resource for enterprises, and a clear target for nefarious activity. Unlike enterprise security of the past, IoT device and network security must keep pace with real-time data and with the thousands or millions of new enterprise access points that can potentially be compromised.

All it takes is a look at recent headlines about breaches at companies like Yahoo! and Target to realize business and consumer data is no longer safe from prying eyes, especially now that it’s largely stored and transmitted through the cloud. Security breaches aren’t just becoming more prevalent; their impact is becoming more serious. A major security breach could put a company out of business or destroy its brand reputation if customers, vendors, and partners lose trust in the organization’s ability to securely operate.

The threat of sensitive business information residing on unsecured mobile devices or wireless networks became a concern with the advent of enterprise mobility and machine-to-machine (M2M) technology. With M2M, one machine communicates with another across an internal network via embedded hardware modules, keeping data relatively localized. In the wider networks of IoT, the threat becomes even more pronounced, as IoT data is shared across internal networks, in the cloud, and on devices. The sheer reach and volume of data generated make IoT security an unprecedented challenge for businesses.

IoT devices also suffer from a lack of industry-wide security standards. In enterprise mobile technology, security largely takes place at the component level: manufacturer security at the device level, enterprise security at the software layer, and network security in the cloud. When IoT software is overlaid onto built-in device security, the same basic device now has two very different and distinct security profiles. That makes it challenging for enterprises to manage all program access points: the device, network, and data.

IoT device manufacturers need to work together to develop secure, universal architecture and code management standards. Unfortunately, this level of “coopetition” is a long way away. For now, enterprises are left to develop their own security standards, causing the number of data breaches to grow as companies navigate this new world of device security.

As IoT often involves investing to make currently owned devices and equipment smarter, eliciting the behavior change required to provide adequate security for IoT devices can be a challenge for organizations. For mobile devices, many companies address security and management challenges by working with a third-party Enterprise Mobility Management (EMM) vendor whose software provisions secure and standardized protocols to entire device inventories. By outsourcing these tasks, enterprises gain best-in-class solutions without incurring significant overhead or tying up scarce IT resources. Much as they did with mobile devices in the past, today’s EMM vendors are increasingly incorporating IoT devices into their platforms and building out industry best practices for this new technology.

In the future, organizations will incorporate all IT assets (mobility, M2M, cloud, IoT, and traditional legacy infrastructure) into a single management platform, in many cases through a third-party relationship. Like mobile device security, IoT security will largely be driven by outside partners with experience incorporating IoT devices into enterprise device management portfolios and security protocols.

To successfully accomplish this, IT will need to involve virtually every organizational decision-maker within telecom, procurement, and purchasing departments. An enterprise’s IT asset buyers have not traditionally been the same people setting up carrier accounts or paying the bills. By bringing together different departments, businesses can get closer to creating IoT standards that minimize the risk of security breaches and allow businesses to better compete in this new era: the Internet of Everything.

Data Wrangling, “Groups & Loops,” and Some Company Called Google: Questioning Authority with Trifacta CEO Adam Wilson

As CEO of Trifacta, Adam Wilson is committed to developing the best in data-wrangling technology, and then of course, preaching its gospel. He and I spoke recently about Trifacta’s past, present, and future (“groups and loops”), partnerships with companies you might have heard of, and how the enterprise data landscape is evolving (for the better).

TOPH WHITMORE: Tell me about Trifacta’s backstory. Where did it all begin?

ADAM WILSON: Trifacta was born of a joint research project between the University of California, Berkeley and Stanford. There was a distributed-computing professor at Cal who had been doing work in this area [data wrangling] for almost a decade, looking at the intersection of people, data, and computation. He got together with a human-computer interaction professor from Stanford who was trying to solve the complex problem of transforming and preparing data for analysis.

And they were joined by a Stanford PhD student who had worked as a data scientist at Citadel on trading-platform algorithms. He found he spent the majority of his time pushing data together, cleansing it, and refining it, as opposed to actually working on algorithms. He returned to Stanford to work with these professors to figure out how to eliminate the 80% of the pain that exists in these analytics problems by automating the coding or tooling, and making it more self-service. The three of them worked together and created a prototype called the Stanford Data Wrangler. Within six months, 30,000 people were using it, and they realized they had more than an academic research project. So they created a commercial entity and started delivering to customers like Pepsi, Pfizer, GoPro, and RBS.

I joined two-and-a-half years ago to help with go-to-market. At the time, the question was how do we help people take data from raw to refined, get productive with that information quickly, and do so in a self-service manner? We focused on customer acquisition, and I’m pleased to say we now have more than 7000 companies using Trifacta technology. And customer use of Trifacta data-wrangling technology creates training data that improves our machine learning.

TW: How does machine learning show up in Trifacta? And what drove your investment in it?

AW: Historically, machine learning has been the exclusive purview of only the highly technical. But machine learning and artificial intelligence have been part of Trifacta since the beginning. There are two fundamental observations. First, every data set is not a new data set. There are things we can infer from the data itself. Whether it’s inferring data types or inferring joins, we can provide automated structuring in a straightforward manner.

Second, we learn from user behavior. As users interact with data, we can make recommendations based on that behavior. Based on our own analysis, we can recognize they are dealing with a specific kind of data and interacting with it in a particular way, and we can make a suggestion. They can choose that suggestion and get immediate feedback as to what the data would look like if they apply those suggested rules. That cuts down on iteration. The end users can make a quick decision, see what it looks like, and if they don’t like it, make a different decision. Over time, they build up intelligence that encapsulates all the rules they are applying to the data. And that becomes something they can share, reuse, and recycle.

It’s not just about individual productivity in getting to refined data. It’s about how end users can collectively leverage that across teams or an enterprise to help curate data at scale.
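To make the automated structuring Wilson describes concrete, here is a minimal rule-based sketch of column type inference in Python. It illustrates the general idea only; it is not Trifacta’s implementation, which is far more sophisticated and, as Wilson notes, also learns from user behavior.

```python
import re

# Candidate types, checked in order; each pairs a name with a pattern
# that a typical value of that type should match.
PATTERNS = [
    ("integer", re.compile(r"^-?\d+$")),
    ("decimal", re.compile(r"^-?\d+\.\d+$")),
    ("date", re.compile(r"^\d{4}-\d{2}-\d{2}$")),
    ("email", re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")),
]

def infer_type(values, threshold=0.9):
    """Return the first type whose pattern matches at least `threshold`
    of the non-blank values, falling back to 'string'."""
    sample = [v.strip() for v in values if v and v.strip()]
    if not sample:
        return "string"
    for name, pattern in PATTERNS:
        matches = sum(1 for v in sample if pattern.match(v))
        if matches / len(sample) >= threshold:
            return name
    return "string"

column = ["2017-05-03", "2017-05-11", "", "2017-06-15"]
print(infer_type(column))  # -> 'date' (the blank cell is ignored)
```

A recommendation engine of the kind Wilson describes would layer on top of this: once a column looks like a date, the tool can suggest parsing, standardizing, or joining on it, and learn from which suggestions users accept.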

TW: The business value of the machine learning you’re describing…does that take Trifacta into sales conversations with business stakeholders? Or do you evangelize primarily to an IT operations audience?

AW: The winners in this market are going to be those who recognize that collaboration between those two enterprise roles is absolutely essential. In the past, you’ve seen people building technical tools for IT organizations, and who have lost track of who the end consumer is, and have not provided self service. Or, on the flip side, you’ve seen BI technologies that embed lightweight data tools, but in the end lose track of the fact that IT needs to be able to govern that information, curate that information, secure it, and ensure it’s leveraged across the organization.

From the beginning, Trifacta has been a strong advocate of a vendor-neutral data-wrangling layer that allows you to wrangle data from everything, and in many regards, allows people to change their minds. You may be using any storage or data-visualization technology, but you don’t want to feel locked into any one decision that you’re making. You always want to be able to transform your data so that it’s useful, regardless of where you might be storing or processing it, or how you might be visualizing it now. Wrangle once, use everywhere.

We have a large financial services customer that uses 136 different BI-reporting solutions. The idea that they can wrangle that data in 136 different ways with 136 different tools was surprising for them. We provide a linear way to wrangle that information, refine it, then publish it out through a number of different channels, all with a high degree of confidence that it’s correct, and with appropriate lineage and metadata tracking how the source data has changed.

TW: Trifacta has pursued a proactive alliance strategy. Tell me about the partnership with Alation. How do the two technologies complement each other?

AW: I’m excited about the partnership with Alation! We have joint customers, including Munich RE, Marketshare, BNSF, and a number of companies looking to combine cataloging with wrangling. The idea is, when the data gets integrated into the large-scale data lakes, the first step is let me inventory it, then let me create an enterprise data dictionary that makes discovery and finding assets easier. Then, let me refine that data, enrich it, and transform it into something that will drive my downstream analysis. It starts with getting that data-lake infrastructure in place, then bringing in the tooling to allow end users to make productive use of the data that’s in the data lake.

Our customers use many different BI and visualization tools like Qlik, MicroStrategy, or Tableau, and sometimes modeling or predictive analytics environments like DataRobot. The front-end technologies serve different types of data consumption, but the cataloging combined with the wrangling is complementary, and ensures you can operationalize your data lake and expose it to a broad set of users.

TW: You’ve also recently partnered with a little startup called Google. Tell me about that partnership, what it means to Trifacta, what it means to your customers?

AW: Our vision for the space has always been self-service. That approach helps alleviate infrastructure friction. Any time we can help people get wrangling faster and spend more time with the data as opposed to configuring infrastructure, that’s a win. About a year ago, Google took a look at this market and recognized that, as more data lands on the Google Cloud Platform (and in particular, cloud storage), Google needed a way to help those customers get that data into BigQuery, and to leverage it with technology like TensorFlow, accelerating the process of seeing value from the data in those environments.

Google did an exhaustive search, and they selected Trifacta as the Google data-preparation solution. We worked with Google to ensure scalability, and that included integrating with Google Dataflow, and authentication, and security infrastructure. Google will take us to market as “Google Cloud Dataprep,” under the Google brand, and sell it alongside and in combination with new Google cloud services. To my knowledge, it’s the first time that Google has OEM’ed a third-party technology as part of the Google Cloud Platform.

TW: I have to ask—since I’m speaking with the CEO—will Google buy Trifacta?

AW: A lot of the value in a solution like Trifacta is being the decoder ring for data. Our independence is an important part of where the value is in the company. The fact that Trifacta can gracefully interoperate with on-prem systems and cloud environments was important to Google in making the decision to standardize on Trifacta. There’s value in our independence, so for us, the exciting thing is not only having the Google seal of approval, but delivering a multitude of hybrid use cases. HSBC is a joint customer, and uses Google for risk and compliance management and financial reporting. Trifacta data-wrangling has become a critical capability for HSBC to leverage, particularly with regard to data governance. Regulations change, keeping up with them is a huge burden, but Trifacta gives HSBC the flexibility to wrangle its data—on-prem or in the cloud—and create value in that evolving regulatory environment.

I sometimes get asked about what the Google partnership means for exclusivity—Will Trifacta still work with AWS, and Microsoft Azure, and others? The answer is absolutely yes. We’ve had a leading cloud vendor really shape our cloud capabilities, and accelerate our cloud roadmap. But we’ve made sure that everything we’ve done can be leveraged elsewhere, in other cloud environments. It’s not just a hybrid world between cloud and on-prem, it’s a multi-cloud world. That was important to Google. Google has multi-cloud customers, and they need to be able to wrangle data in those environments as well.

TW: Very diplomatic answer! Where to next for Trifacta?

AW: Three things. The first two are “groups and loops.” We put effort into self service, governance, machine learning. Now we want to apply this to provide fundamentally better solutions for teams to work together, to collaborate more efficiently. We’ve only just scratched the surface, and in the next twelve months you’ll see innovation from Trifacta in what it means to collaboratively curate information, and then learn from collective intelligence. How do we crowd-source that curation? How do we share collective intelligence most efficiently? And how do you get organizational leverage across it?

As for “loops,” we’re looking at how we ensure that this collective intelligence can be reused and operationalized to scale with ever-increasing efficiency. We see a tool chain of data tools to be crafted that will essentially become the workbench for how modern knowledge workers get productive and collaborate.

Third, Trifacta is looking at how we can embrace real-time data streaming, as more and more of the data is streamed into these environments.

This Week in DataOps: The Promotional Edition

Spring has sprung (finally, though only briefly here in Canada), which means it’s webinar and publishing season! And that makes for a busy month in the DataOps world.

Join me and Information Builders VP of Marketing Jake Freivald on Thursday, April 27, 2017 for our webinar on “DataOps in the Real World: How Innovators are Reinventing Their Business Models with Smarter Data Management.” I’ll be providing an overview of DataOps—what it is, how it works, and why it matters—and presenting an interesting healthcare case example. (So far, only two slides include pictures of my head.) I’m looking forward to an enlightening discussion! Registration details and more information available here.


Silos kill! Well, they at least hinder progress. Keep an eye out for my upcoming DataOps report “No More Silos: How DataOps Technologies Overcome Enterprise Data Isolationism.” (Tentative publication date = Friday, April 28, 2017.) The research looks at how data innovators leverage technologies from vendors like Informatica, Domo, Switchboard Software, Microsoft, Yellowfin, and GoodData to break down organizational, architectural, and process-based enterprise silos.

Here’s what the first page might just look like:

[Preview image: page 1 of “No More Silos”]