Topics of Interest Archives: Visualization

Birst Announces Tech Partnership with Tableau? What's Going On?

On April 15, 2015, Birst announced a technology partnership with Tableau in which Birst’s BI platform will connect to Tableau via ODBC, allowing Birst users to connect directly to Tableau and Tableau customers to connect directly to Birst. This may come as a surprise both for buyers who have previously weighed Birst against Tableau in competitive business intelligence purchases, and for existing customers who may have considered the two vendors to be similar in nature. So, why does this announcement make sense?
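Mechanically, the integration is a standard ODBC handshake: Tableau, or any other ODBC client, treats Birst as just another queryable database. As a rough sketch of what that looks like from a generic Python client (the DSN name, credentials, and table below are hypothetical placeholders, not documented Birst values):

```python
# Hypothetical sketch: querying a Birst space over ODBC, the same
# wire-level path a Tableau connection would use. The DSN, login, and
# table name are invented placeholders for illustration only.
import pyodbc

# Assumes a Birst ODBC driver is installed and a DSN has been configured
conn = pyodbc.connect("DSN=BirstSpace;UID=analyst@example.com;PWD=secret")
cursor = conn.cursor()

# Pull an aggregate from a (hypothetical) Birst-managed datamart
cursor.execute("SELECT region, SUM(revenue) FROM sales GROUP BY region")
for region, revenue in cursor.fetchall():
    print(region, revenue)

conn.close()
```

The point of the partnership is that this query path runs in both directions: Tableau users can point their visualizations at Birst’s curated data tier rather than at raw source systems.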

Although both of these companies have been lumped into the “BI Platform and Solution” bucket by a number of analyst firms, Blue Hill believes that these companies are actually quite different both in their focus and in their core value propositions. Typically, Blue Hill has found that when Birst and Tableau are head-to-head in a deal, the decision ends up being relatively straightforward because these vendors differ significantly in their strengths and weaknesses.

Although both companies are rising stars in their own right, end users must understand that these vendors take quite different approaches to supporting business intelligence. This technical partnership reflects the fact that Birst and Tableau, although typically seen as BI competitors, can also be used together in a single analytics environment. To understand how this works, consider how each has evolved over the past several years.

Birst and Two-Tier Data Architecture

Although Birst has its own visualization capabilities, as well as a predictive visualization wizard in Birst Visualizer, its greatest strength is actually as a bridge to bring together legacy data sources and data warehouses with emerging needs for new datamarts and cloud-based data into a single destination for corporate BI. This bridge occurs through Birst’s user-level data tier, which brings in a unique data view for each user within an organization based on the combination of internal and third-party data sources that an enterprise user may need.


By allowing each user to access and shape their own data environment based on individual needs, and then serving as the intermediary that updates enterprise data environments, Birst ends up being the key traffic director between individual data explorers and the enterprise truth. Because the social, mobile, and cloud technology paradigm has splintered data sources and data analysis, companies now must figure out how to put all the pieces together again. It’s a Humpty Dumpty problem: all the king’s horses and all the king’s men must put Humpty Dumpty together again, or risk losing basic visibility into key business processes. This is where Birst has a strong opportunity to support individual analytic choices while tying each individual’s actions back to an enterprise environment.

Tableau and Data Discovery

Over the past few years, Tableau has instigated a new arms race in visualization where cloud BI vendors such as Birst and GoodData; standalone players such as Qlik, SAS, Information Builders, and MicroStrategy; and megavendors such as Oracle, IBM, Microsoft, and SAP all had to improve their visualizations and data discovery capabilities. The BI market can thank Tableau for creating a new standard that every other vendor had to match.

Now, companies doing their due diligence in BI typically know who Tableau is. The next step is for Tableau to increase the size and scale of its data discovery environments, an ambition most obvious in Tableau’s Drive methodology, designed to “scale a culture of analytics.” This culture challenge has been one of the hardest problems in business intelligence. Although a fair number of enterprises have developed a BI center of excellence, and vendors have become increasingly flexible with their licensing approaches, the real barrier to adoption has been the complexity of building true IT-business partnerships to support analytic environments. This is the predicament that Tableau has sought to tackle with the Drive approach, and then to support with its own software and technology partners.

Birst and Tableau: Two Roads Diverged

This partnership reflects Birst and Tableau’s diverging paths in the enterprise BI world. Although both vendors will still find themselves competing against each other in specific deals, Blue Hill believes that this partnership is a good example of how each company is pursuing its strengths.

The reality is that datamarts and basic reporting have been around for decades. If the traditional methods of supporting these needs were good enough, in and of themselves, neither Birst nor Tableau would have ever taken off. But to reach the next level of integration with legacy enterprise environments, Birst and Tableau now can work together, at least on a technical level.

For business leaders, this partnership should help paint a clearer picture of their organization’s data analysis roadmap. The relative strengths of each solution illuminate the need for a combined approach. Ultimately, to ensure meaningful top-level data exploration, you must first ensure that the underlying data is trustworthy and complete. To build a full stack of data analysis capabilities, the choice was never really one or the other; it was combining the complementary aspects of each. Now, with this partnership, the choice has become a little easier.


The Business Analyst’s Next Frontier: Advanced Analytics

We have posited in prior blogs that features and functionality which were once differentiators in the BI and analytics space are becoming ubiquitous. That is to say, cloud, mobile, and self-service access to reports and dashboards has moved from the realm of unique competitive advantage to table stakes.

While there are certainly battles yet to be won on pricing, user interface, extensibility, data management, customer service, and a host of other factors, some of the most intense competition is happening around advanced analytics. The term “advanced analytics” itself can encompass a variety of undertakings, from geospatial analysis and graph data analysis to predictive modeling.

The upside of adopting such capabilities is that organizations have the opportunity to get far more out of their data than they would from simply reporting on it via dashboards. For instance, undertakings that improve forecasting, reduce customer churn, curb fraud, or optimize production capacity can all have a meaningful impact on a firm’s bottom line.

In this case, the Holy Grail is figuring out how to give regular business analysts the ability to perform analysis that was traditionally reserved for specialized teams of data scientists. Just as desktop data-discovery tools and self-service BI helped democratize access to data throughout the company, vendors are hoping to extend advanced analytics to a broader population of business users. The result has been an arms race, with vendors scrambling to build, buy, or partner their way into the conversation.

The constraint for many organizations has been the price tag. Even if the explicit cost of an advanced analytics solution is low, as with integrations built on the open source R language, the investment of time and personnel is often still significant. Simply put, advanced analytics expertise is expensive. Putting advanced analytics capabilities in the hands of business analysts delivers value in two directions. Companies without prior investments in advanced analytics can now attainably perform basic forecasting and modeling that they otherwise could not. Companies that already have teams of experts can push lower-level, less-complex requests down to business analysts, freeing data scientists to take on more complex challenges.
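To make that concrete, the sketch below shows the flavor of lower-level modeling request that could plausibly be pushed down to a business analyst: a tiny logistic-regression churn model in Python with scikit-learn. The feature names and figures are invented for illustration and are not drawn from any vendor’s product.

```python
# A minimal sketch of "basic modeling" within an analyst's reach:
# score customers for churn risk from two illustrative features.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: monthly_spend ($), support_tickets (count) -- invented data
X = np.array([[20, 5], [80, 0], [15, 7], [90, 1], [30, 4], [70, 0]])
y = np.array([1, 0, 1, 0, 1, 0])  # 1 = churned, 0 = retained

model = LogisticRegression().fit(X, y)

# Score a new customer: low spend and many tickets -> high churn risk
print(model.predict_proba([[25, 6]])[0, 1])
```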

Business decision-makers evaluating their next BI and analytics investment should consider to what extent advanced analytics capabilities are built in. Megavendors such as IBM, SAP, and Microsoft have responded by releasing freemium offerings that give business analysts an accessible way to try their hand at these capabilities.

To this end, IBM Watson Analytics has taken an impressive leap in integrating forecasting into a visual interface that masks the complexity of the underlying SPSS functionality required to perform the analysis. From a user experience perspective, Microsoft’s release of Power BI takes a similar approach in that it integrates natural language processing so that users can ask questions of their data in a code-free environment. The cloud-based model and low cost further extend its accessibility to business analysts.

In a similar vein, SAP Lumira is working toward the convergence of data discovery and advanced analytics through continued integration of SAP’s Predictive Analysis suite. Since acquiring KXEN in 2013, SAP has prioritized its assimilation into the overall analytics portfolio, the end goal (and important differentiator) being that a business analyst using the Lumira interface will have access to advanced analytical functionality backed by SAP’s enterprise-grade data governance and the performance of the HANA platform.


Coming from a different angle, a few emerging advanced analytics players, such as Alpine Data Labs, Alteryx, and RapidMiner, are quickly introducing code-free environments with powerful capabilities. The blending, modeling, and automation capabilities of these companies hold up even at massive scale, making them important tools for augmenting the capabilities of data scientists and ordinary business analysts alike.

It is important to note that what we are talking about, in many of these cases, is an extension of the business analyst’s capabilities. We are broadening what can be accomplished by taking BI to its next logical extension. I’m not suggesting that these functionalities can displace truly advanced analysis across complex time series, data mining, multivariate environments, or Big Data sets from sources like machine sensors. Rather, businesses are demanding greater functionality on top of core BI and analytics capabilities, and the vendors who deliver on this will stand to gain.

There will be inevitable pushback as more advanced functionality is placed in the hands of users who are not formally trained data scientists. Making business decisions on shaky analysis is a dangerous proposition, and no doubt many will wonder whether giving functionality to those who don’t know how to use it is akin to “giving them enough rope to hang themselves.” This is a largely paternalistic viewpoint that will taper off in the same way that similar fears about desktop data discovery did. Vendors will continue to build out guided user experiences to assuage these fears, and organizations will put safeguards in place to ensure the quality of underlying analysis. Ultimately, signs point to a future where the barriers to advanced data analysis continue to fall, and the capabilities of the data analyst continue to expand.


How SAP Lumira is Pushing Forward the New BI

The world of business intelligence always seems to find new problems to address or new use cases to explore to drive the next generation of innovation. Since there are more than could reasonably fit into one blog post, I’d like to focus on two of the defining themes of the moment: data visualization and self-service access to data.

I’ve written a lot on both of these areas recently, and it’s certainly easy to see what makes them so compelling. Visualization allows us to add a degree of design and expression to how we communicate findings from data in ways that simply aren’t possible without it. Self-service analytics solutions provide an opportunity for tremendous efficiency gains by allowing individual analysts to drill into data sets and ask questions on the fly without needing to work through a monolithic IT infrastructure.

These revolutions have fueled more than a few upstart software providers, all trying to improve the BI experience and fill gaps left by incumbent solutions.

SAP and the New BI

This is why the enthusiasm around SAP Lumira really caught my attention when I was down in Fort Worth, Texas for the annual ASUG/SAP Analytics and BusinessObjects Conference. For those of you who might not know, Lumira is a visual data discovery solution meant to provide individuals with self-service access to data. Lumira first came on the market in early 2013 and has been on an impressive release cycle ever since. SAP has taken the idea of visual storytelling seriously. One capability that underscores this point is Lumira’s ability to build infographics directly from the data you are working on, without needing to export that data to a separate design tool.

A number of interesting use cases emerge from self-service visualization tools such as Lumira. These tools make it easy to do things like display store revenue information across map overlays. Visually depicting geographic distributions and other visualizations can help users spot trends that they might not have otherwise seen. I’ve seen and heard about a number of such use cases when watching demos and chatting with end users on the show floor as well as conducting my own (albeit more rudimentary) analysis with Lumira. Ultimately, the hope is that a better way to access and show data can lead to more informed and faster decisions.

Key to these trends of self-service and visualization is the shift of focus from an enterprise IT view to a view of the individual user. In the new world of BI, vendors focus first on winning the hearts of individuals, as their experiences are the proving ground for value and the basis of IT decisions. The desktop version of Lumira is available for individuals to download and use for free. This is an important point because it represents a major rethinking of approach on SAP’s part. SAP’s go-to-market strategy with Lumira is an important signal that it has its eye on those individual users, and is equipped for the new age of BI.

Potential Road Blocks

But there is a caveat. Great visualization experiences and individual empowerment are fantastic, but in the world of enterprise IT, there is always the question of scalability. Watching the industry mature, as vendors work to deliver on their promises of better BI, has been … interesting.

There is a fine line between empowering individuals with access to data, and maintaining some level of control within the enterprise. If data is not defined and updated in a consistent way throughout a company, it opens the door for a number of challenges. Individuals performing the same analysis with different interpretations of the underlying data (or out-of-date data) may arrive at different conclusions. Some affectionately call this scenario “data anarchy,” while others have less flattering names. The challenge for organizations here is to maintain trust in the underlying data used in an analysis. When you lose trust in the data, your conclusions lose credibility, and even the prettiest charts are merely pretty charts.

“Governed data discovery” is how many vendors are tackling this issue. It is a balancing act along the tightrope of access and control. The past few months have seen governed data discovery emerge within the marketplace, and I’ve looked at a number of vendors who have embraced it.


Looking Forward

Governed data discovery is an area in which SAP has the opportunity to build a significant competitive advantage. SAP’s experience managing deployments in the world’s largest organizations no doubt guided it in building Lumira: a solution that is simultaneously individual-centric and enterprise-worthy. To this end, SAP’s Lumira Server looks to be a highly scalable option for companies trying to walk this line. Additionally, it is compatible with SAP BusinessObjects and Predictive Analysis, allowing it to fit into existing infrastructure. This should be a major point of consideration for IT decision-makers.

All in all, SAP’s continued push for success with Lumira is an important validation of the major forces reshaping the enterprise experience of data. Look for a continuing convergence towards self-service access and visual storytelling. Both areas will be a major battleground for vendors, and SAP has made a strong entrance into the arena. In working to reshape the traditional conception of BI, SAP is making a compelling run in the enterprise space by focusing on the individual.


How Data Visualization Empowers Decision Making and Who Is Getting Us There

Organizations continue to push data-driven decision-making into more of their activities. As a result, demand to find meaning within the mountains of collected data is growing rapidly. The bottleneck in understanding all this data is the scarcity of individuals (read: data scientists, statisticians, and the like) who can use it to glean meaningful insight. This has historically been a challenge for many organizations, but data visualization tools can help us bridge the gap.

Our brains, while inherently inferior to computers in the realm of computation, are comparatively phenomenal at pattern recognition. Additionally, they are hard-wired to respond to visual inputs. While we may not be able to look at a string of numbers and intuitively know the standard deviation of a sample, we are very good at picking up patterns from visual displays such as a scatter plot.
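A small synthetic example makes the point. In the Python sketch below, y is perfectly determined by x, yet the linear correlation coefficient, a pure computation, comes out at zero; a single scatter plot lets the eye spot the parabola instantly. The data is invented for illustration.

```python
# Synthetic illustration: a strong pattern that a summary statistic misses.
import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(-3, 3, 200)
y = x ** 2  # perfectly determined by x, but not linearly

# The computed answer: a single number that sees no relationship
print(f"linear correlation: {np.corrcoef(x, y)[0, 1]:.3f}")  # ~0.000

# The visual answer: the shape is obvious at a glance
plt.scatter(x, y, s=8)
plt.xlabel("x")
plt.ylabel("y")
plt.title("Zero correlation, unmistakable pattern")
plt.show()
```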

But let’s take this one step further. How does the human brain make complex decisions? How are business leaders able to combine statistical calculations, past experiences, and intuition to make impactful decisions such as creating new products, targeting certain markets, or pivoting business strategy? More importantly (for this blog at least), how can data visualization help us get to these outcomes, and what vendors are getting us there?

I’d like to introduce what we here at Blue Hill are calling “The Cognition Matrix™”.

[Figure: The Cognition Matrix]

At the most basic level, binary processing is simple calculation. For the business user, this might help answer questions such as: what was revenue this quarter? At a more complex level, binary processing leads us into the realm of statistical modeling or forecasting. Binary processing is where machines excel, and we rely heavily on their processing power to complement our own decision-making processes.

But meaningful insights come from our ability to understand inputs within the context of prior experiences – in other words, from pattern recognition. On a basic level, pattern recognition manifests itself in instinct. Consider an example from the physical world: catching a fly ball. We could certainly run calculations based on velocity and flight path to find the correct place to stand, but instead we need only the visual cue of the ball in the air and our past experiences to instinctually know where to go.

Because of our brain’s disposition to visual cues, bridging the gap between binary processing and pattern recognition can often be achieved through some form of visualization. Simple charts and graphs help us see movements such as declining revenue performance, and complex visualizations can help us find trends that our brains are not otherwise able to identify from strings of data.

At the highest levels of decision-making, we must make challenging leaps of understanding in the face of nuance and uncertainty. We often rely on data and analysis in tandem with our intuition to guide our process. Making analysis more digestible to the human mind through data visualization is an important element in complementing our decision-making.

[Figure: The Cognition Matrix with Visualization]

So which vendors are helping us get to cognition? They form an interesting mix, ranging from legacy players to younger visualization-first companies that provide compelling solutions. For the business user, the greatest value is achieved through self-service and analyst-centric solutions for data visualization.

Qlik, Tableau, and Yellowfin are, relatively speaking, the new kids on the block, although they can be credited with validating the self-service visualization movement. Each has been lauded for its innovations in self-service data discovery and stunning visualizations, and all three feature their own version of a storyboarding application that emphasizes the individual analyst’s ability to communicate findings.

MicroStrategy and Information Builders are among the largest and most established of the independent business intelligence vendors. While they originally cut their teeth on dashboarding and reporting, each has come to market with strong self-service visualization tools. MicroStrategy’s Analytics Desktop and Information Builders’ InfoDiscovery are analyst-centric tools that excel at finding insights within data.

Of the mega-vendors, SAP and IBM stand out for a number of reasons. Both SAP Lumira and IBM Watson Analytics are predicated on helping analysts tell stories with their data, and each company has invested heavily in creating impressive user interfaces. Lumira and Watson Analytics both offer free versions available from their respective company websites, an important signal of a departure from traditional strategies and of how serious both vendors are about making these products a success.

As visualization continues to be at the forefront of the data analysis discussion, expect to see a continuing convergence of players innovating in the space. Basic data visualization and self-service capabilities will no longer be the competitive differentiators they once were, and vendors will continue to push the envelope of packaging analysis into decision-driving stories.

If you’d like to dig deeper into our thoughts on bio-inspired and cognitive computing, then please read my post on business intelligence, data visualization, and the brain.

Later this month we will be releasing a more extensive report diving deeper into this issue. In addition to further examining how data visualization empowers business decision making, we will be highlighting more vendors and going into greater detail on each.


Blue Hill's Q4 Self-Service Analytics Research

There is a fundamental issue in the world of enterprise analytics and data management that is vital to the future of business intelligence and analytics: are employees free and able to pursue the deep analytical insights needed to advance their business goals? The concept of the analytical business has become more popular in recent years as statistics and algorithms have become sexy concepts. One need only look at “Moneyball” to see that statistics are no longer relegated to the nerd squad. When Brad Pitt becomes the face of gaining analytic advantages in the workplace, analytics has arrived as a mainstream business topic.

But the popularity of analytics does not mean that it has been fully realized as a set of tools that are ubiquitous and easy to use. Although we have seen phenomenal strides in the tools made available to support business intelligence over the past five years, we still largely live in a world of haves and have-nots when it comes to analytic access.

Why is this? Part of the problem is that we as an industry are defining analytic freedom in different ways. A simple way to think of this is to consider the enterprise-wide view, the department-wide view, and the individual view.

Some of us look at this from a company-wide view, where analytic freedom means having agile data warehousing, robust ETL, a portfolio of analytic applications custom-made for each department, an army of number crunchers to handle each predictive request, and a fully-realized BI Center of Excellence.

Yet others look at the department-wide view, where the key is to provide each employee within a department with relevant data. For a marketing department, this might mean a 360-degree view of all campaigns, products, and customers. For a manufacturing department, this might mean full access to operational efficiencies, production, and Six Sigma efforts. These needs are often met by department-specific applications such as CRM and marketing automation management. But outside of the department’s purview, everybody else’s data problems are irrelevant. As a result, these department-specific solutions merely create silos, where data-driven enlightenment is limited to a few individuals and to certain tasks within a single department.

And finally, there is the individual’s need for data. There is the 1% of data analysts who are able to independently work with the vast majority of data sources, statistically analyze them, and find key connections that have previously escaped detection. We call them data scientists, and the only thing we truly know about this rare and prized species is that there is an enormous shortage of these individuals. But for the rest of us, vendors still need to catch up and provide a variety of tools that will give the typical knowledge worker the same access to data and analytics that the data analysts and data scientists have. This is no small task, as it requires transformative products to be developed in multiple areas: data cleansing, data management, business intelligence, predictive analytics, and performance management.

To make good decisions, individuals first have to find the correct data sources, and then make sure the data is clean and reliable. This means going through everything: formal business data repositories, third-party data, collected survey and sensor data, informal spreadsheets and tallies, and more. In doing so, employees are often tasked with cleaning up the manual mistakes associated with data collection and collation. The subsequent task of data cleansing is estimated to take up three-quarters of a data analyst’s time. To reallocate this time to more valuable tasks, such as direct data analysis or aligning results with specific business initiatives, companies need to take advantage of self-service and automated data management tools that solve basic problems in data management. This may include issues as mundane as changing “Y” to “Yes” or providing a default value for any null values in a column, or the automatic joining of fields in unrelated data sources that have never been linked before. As Blue Hill examines data management, we plan to look at vendors ranging from market leaders such as IBM and Informatica to emerging startups such as Tamr, Trifacta, and Paxata to determine how each solution supports Blue Hill’s key stakeholders in technology, finance, and the line of business.
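As a minimal sketch of how mundane these fixes are, and why they beg for automation, here is what the “Y” to “Yes” normalization and null-defaulting described above might look like in pandas; the column names and values are illustrative.

```python
# Illustrative cleanup of the kind described above, using pandas.
import pandas as pd

df = pd.DataFrame({
    "responded": ["Y", "Yes", "N", None, "Y"],
    "region":    ["East", "West", None, "East", "West"],
})

# Normalize inconsistent codings such as "Y" -> "Yes"
df["responded"] = df["responded"].replace({"Y": "Yes", "N": "No"})

# Provide a default value for any null values in a column
df["responded"] = df["responded"].fillna("No")
df["region"] = df["region"].fillna("Unknown")

print(df)
```

Multiply a dozen such rules across hundreds of columns and sources, and the case for self-service and automated tooling makes itself.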

There has been a recent wave of evolution focused on self-service business intelligence. The vendors that have caught Blue Hill’s attention to the greatest extent in this regard include Adaptive Insights, Birst, GoodData, IBM Watson Analytics, Microsoft PowerPivot, Qlik Sense, SAP Lumira, Tableau, and Yellowfin. One of the most interesting aspects of this evolution is that end users may initially assume that the startup vendors would be less scalable, while the established enterprise vendors would be more difficult to use. This assumption is a false dichotomy: all of the leading vendors in this space, regardless of size, must scale and be easy to use. The key differentiators among these vendors tend to be the roles that they play within the enterprise and the extent to which they address the Blue Hill Corporate Hierarchy of Needs.


Predictive analytics has been a more difficult area to innovate in from a usability perspective. The biggest challenge has traditionally been the basic hurdle of statistical knowledge. For instance, Microsoft Excel has long had a statistical package sufficient to handle basic requests, but the vast majority of Excel users don’t know how to access or use it. Likewise, the statistical software giants, IBM SPSS and SAS, are easy enough to find in the academic world, where students cut their teeth on statistical analysis. But for knowledge workers who were not number crunchers in their college days, this availability is (appropriately enough) academic compared to their day-to-day requests for sales projections, production forecasts, and budget estimates. Because of this, the drag-and-drop workflows of Alteryx, the natural language inputs of Watson Analytics, and the modeling ease of RapidMiner and SAP InfiniteInsight are going to become increasingly important as organizations seek to change from reactive monitors of data into predictive and cognitive analyzers of data-driven patterns.
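For a sense of scale, the day-to-day request these tools aim to demystify is often no more exotic than a trend projection. A rough Python sketch, with invented quarterly figures:

```python
# A minimal sales projection: fit a linear trend to quarterly revenue
# and extend it one quarter ahead. Figures are invented for illustration.
import numpy as np

revenue = np.array([1.20, 1.35, 1.28, 1.50, 1.62, 1.71])  # $M per quarter
quarters = np.arange(len(revenue))

# Ordinary least-squares fit of a straight line through the history
slope, intercept = np.polyfit(quarters, revenue, deg=1)

forecast = slope * len(revenue) + intercept
print(f"Projected revenue next quarter: ${forecast:.2f}M")
```

The drag-and-drop and natural-language products listed above wrap exactly this kind of computation, and its more rigorous cousins, behind interfaces that never show the analyst a line of code.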

Finally, Enterprise Performance Management (EPM) represents an important subset of business intelligence focused on financial and operational planning, a core capability for any business. Small companies typically use spreadsheets to handle this analysis. But as operations come to span multiple currencies and countries, complex supply chains, diverse tax structures, and even treasury activities, companies increasingly need a dedicated EPM solution that can be shared among multiple finance officers. At the same time, EPM needs to remain easy to use, or companies risk trading the assurance of compliance for delays of days or even weeks in financial closes and budgeting activities. In light of this core challenge, Blue Hill is looking at the offerings of large software vendors (such as Oracle, IBM, SAP, and Infor) as well as newer upstarts (such as Adaptive Insights, Host Analytics, Tidemark, and Tagetik) to see how they have worked to simplify the Enterprise Performance Management space.
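As a toy illustration of why spreadsheets strain here, consider the simplest multi-currency rollup: translating each subsidiary’s results into a reporting currency before consolidation. The Python sketch below uses invented entities, figures, and exchange rates; a real close would also contend with rate dating, eliminations, and audit trails.

```python
# Toy multi-currency consolidation: convert subsidiary revenue to USD
# and roll it up. All entities, figures, and rates are invented.
import pandas as pd

results = pd.DataFrame({
    "entity":   ["US", "UK", "DE", "JP"],
    "currency": ["USD", "GBP", "EUR", "JPY"],
    "revenue":  [1_000_000, 600_000, 750_000, 90_000_000],
})

# Period-average rates to USD (illustrative, not real quotes)
fx_to_usd = {"USD": 1.00, "GBP": 1.55, "EUR": 1.10, "JPY": 0.0085}

results["revenue_usd"] = results["revenue"] * results["currency"].map(fx_to_usd)
print(results)
print("Consolidated revenue (USD):", round(results["revenue_usd"].sum()))
```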

These are the key research efforts that Blue Hill will pursue this quarter as we seek to understand the advancement of self-service in analytics, business intelligence, and data management. We are seeking the true differentiators that buyers can hang their hats on through the rest of 2014 and into 2015, as they affect the three key stakeholders: financial, technological, and line-of-business managers.

