
DataHero at the BBBT: A Startup Getting It Right

First, on a tangent not directly focused on the product: Thank you, Chris Neumann, CEO of DataHero. After hearing presenters from multiple companies consistently use the wrong words over the last few months, you used both premise and premises in the appropriate places. Thanks!

As you might gather, Wednesday’s presentation at the BBBT was by DataHero. A fairly young company, less than three years old, DataHero is focused on “Delivering a self-service Cloud BI solution that enables enterprise and SMB users to analyze and visualize their SaaS-based data without IT.”

Self-service BI is what almost all the players, both new and mature companies, are trying to provide these days. That means they’re another player attempting to help business knowledge workers connect to data, analyze it and extract useful, actionable information without heavy intervention from business analysts and IT.

Cloud is also where everyone’s moving, since it has so many advantages across all areas of software. DataHero, as a small company, isn’t just in the Cloud. They’ve smartly decided to begin by focusing on public Cloud applications with accessible APIs.

While that initially simplifies things, the necessity to handle complexity still exists in that world. Mike Ferguson, another BBBT member analyst, pointed out that many of his clients have multiple, customized Salesforce.com instances, bringing the upgrade issues seen in on-premises systems into the Cloud world. Chris acknowledges that and understands the need to grow to handle the issue, but knows that at DataHero’s current size there’s enough of a market for an initially more focused solution.

A strategic issue comes up with the basic nature of the Cloud. Mr. Neumann described the Cloud as opposed to centralized data, but that’s not quite so. Depending on how Cloud systems are set up, they can help or hinder centralization of data. However, right now he is accurate in that most of the growth of Cloud is departmental in nature. It’s also further blurring the always fuzzy line between enterprise and SMB markets by providing applications that both groups can leverage.

Another area that shows thought in their growth strategy is entry into new markets. Chris is clear that they dip their toes into an arena, check reactions, and if the response is positive, try to partner with as many companies in the space as possible to maintain neutrality. That means they don’t get locked into the first vendor a first client wants to work with, regardless of that vendor’s market control, leaving flexibility for customers. Their partner page, though young, clearly shows that strategy in effect. That’s a good move and I wish more vendors would think that way.

Another key growth issue is data cleansing. Right now, DataHero does none, expecting that the source system provides that capability. However, as clients use more and more source systems, there’s a cleansing need to resolve data clashes between different systems. That’s something the team at DataHero says it’s aware of, though, again, that’s future growth (no time frames, as legal sanity dictates…).
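To make that kind of clash concrete, here’s a minimal sketch of the problem with entirely invented field names and values, not anything from DataHero’s product: two SaaS exports describe the same customer but encode a field differently, so a small normalization step is needed before the records can be merged.

```python
# Hypothetical cross-source clash: a CRM export stores full state
# names while a billing export stores abbreviations.
crm_rows = [{"customer": "Acme", "state": "California"}]
billing_rows = [{"customer": "Acme", "state": "CA"}]

# A cleansing step maps both representations onto one canonical form
# so the records join instead of clashing.
STATE_ABBREVIATIONS = {"california": "CA", "ca": "CA"}

def normalize(row):
    row = dict(row)
    row["state"] = STATE_ABBREVIATIONS.get(row["state"].lower(), row["state"])
    return row

merged = {}
for row in map(normalize, crm_rows + billing_rows):
    merged.setdefault(row["customer"], {}).update(row)

print(merged)  # {'Acme': {'customer': 'Acme', 'state': 'CA'}}
```

Trivial here, but multiply it by dozens of fields and a handful of SaaS sources and the need for a cleansing layer becomes obvious.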

The demo was very interesting. The other founder, Jeff Zabel, has a strong history in designing interfaces for software in vehicles, meaning usability really matters. That can be seen in a very clear and simple interface. It is easy to use. However, as pointed out by many other companies, 80% of business data has a location component, and many of DataHero’s competitors are far ahead of them in the area of geospatial information. That’s a key area they’ll have to improve.

Summary

DataHero is a young company with a young product. The key is that they aren’t just looking at their cool product and customizing solely based on first sales. The BI tool is clearly fully fledged for the market segment they’ve chosen for initial release, and they have thought through their growth strategy in far more detail than I’ve seen from other vendors who have presented at the BBBT.

If they execute their vision, and I see no reason why they wouldn’t, the folks at DataHero have a bright future.

Splunk at BBBT: Messages Need to Evolve Too

Our presenters last Friday at the BBBT were Brett Sheppard and Manish Jiandani from Splunk. The company was founded on understanding machine data and the presentation was full of that phrase and focus. However, machine data has a specific meaning, and that’s no longer all Splunk does today. They speak about operational intelligence, but that message needs to bubble up and take over.

Splunk has been public since 2012 and has over 1200 employees, something not many people realize. They were founded in 2004 to address the growing amount of machine data and the main goal the presenters showed is to “Make machine data accessible, usable and valuable to everyone.”

However, their presentation focused on Splunk’s ability to access IVR (Interactive Voice Response) and Twitter transcripts, and that’s not machine data. When questioned, they pointed out that they don’t do semantic analysis but focus on the timestamp and other machine-generated data to understand operational flow. Still, while you might stretch and call that machine data, they also displayed some very simple analytics on the occurrence of keywords in text, and that’s not machine data either.

It’s clear that Splunk has successfully moved past pure machine data into a more robust operational intelligence solution. However, being techies from the Bay Area, it seems they still have their focus on the technology and its origins. They’re now pulling information from sources other than just machines, but are primarily analyzing the context of that information. As Suzanne Hoffman (@revenuemaven), another BBBT member analyst, pointed out during the presentation, they’re focused on the metadata associated with operational data and how to use that metadata to better understand operational processes.

Their demo was typical: nothing spectacular, but all the pieces were there. The visualizations are simple and clear, and they claim the data is accessible to BI vendors for better analytics. However, note that they have a proprietary database and provide access through ODBC and an API. Mileage may vary.
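As a rough illustration of what that access looks like from the outside, here’s a minimal sketch against Splunk’s REST search interface; the host, credentials and search string are placeholders, and the ODBC route would instead go through a driver configured in a BI tool.

```python
# Minimal sketch: pull Splunk results into an external tool via the
# REST API on the management port (host and credentials are placeholders).
import requests

SPLUNK = "https://splunk.example.com:8089"
AUTH = ("admin", "changeme")  # placeholder credentials

# exec_mode=oneshot runs the search synchronously and returns results
# in a single response instead of a job id to poll.
resp = requests.post(
    f"{SPLUNK}/services/search/jobs",
    auth=AUTH,
    data={
        "search": "search index=main error | stats count by host",
        "exec_mode": "oneshot",
        "output_mode": "json",
    },
    verify=False,  # self-signed certs are common on the management port
)
resp.raise_for_status()
for result in resp.json()["results"]:
    print(result["host"], result["count"])
```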

There was also a confusing message in the claim that they’re not optimized for structured data. Machine data is structured. While it often doesn’t have clear field boundaries, there’s a clear structure, and simple parsing lets you know what the fields and data are in the stream. What they really mean is that it’s not optimal for RDBMS data. They suggest that you integrate Splunk and relational data downstream via a BI tool. That makes sense, but again they need to clarify and expose that distinction in a better way.
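For example, a log line with no column headers still parses into fields in a couple of lines. The line below is invented for illustration, but the pattern, a timestamp followed by key=value pairs, is typical of the machine data Splunk indexes.

```python
# Machine data rarely arrives with column headers, but the structure
# is there: a timestamp followed by key=value pairs, recoverable with
# a couple of lines of parsing. (The log line is invented for show.)
import re

line = "2014-06-20T10:15:32Z action=login user=jsmith status=failure latency_ms=120"

timestamp, _, rest = line.partition(" ")
fields = dict(re.findall(r"(\w+)=(\S+)", rest))

print(timestamp)         # 2014-06-20T10:15:32Z
print(fields["status"])  # failure
```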

And then there’s the messaging nit. While business is my main focus, technology presented with the incorrect words jars an educated audience. Splunk is not the first company, nor will it, sadly, be the last, to have people who are confused about the difference between “premise” and “premises.” However, usually it’s only one person in a presentation. The slides and both presenters showed a corporate confusion that leads me to the premise that they’re not aware of how to properly present the difference between Cloud and on-premises solutions.

Hunk: On the Hadoop Bandwagon

Another messaging issue was the repeated mention of Hunk without an explanation. Only later in the presentation did they focus on it. Hunk is their product for putting the Splunk Enterprise technology on Hadoop. Let me be clear: it’s not just accessing Hadoop information for analysis but moving the storage from their proprietary system to Hadoop.

This is a smart move that helps address customers who are heavily invested in Hadoop, and, at least at the presentation level, they have a strong message about offering the same functionality as their core product, just residing on a different technology.

Note that this is not just helping the customer; it helps Splunk scale its own database to reach a wider range of customers. It’s a smart business move.

Security, Call Centers and Changing the Focus

The focus of their business message and a large group of customer slides is, no surprise, on network security and call center performance. The ability to look at the large amount of data and provide analysis of security anomalies means that Splunk is in the Gartner Magic Quadrant for SIEM (Security Information and Event Management).

In addition, IVR was mentioned earlier. That, combined with other call center data, allows Splunk to provide information that helps companies better understand and improve call center effectiveness. It’s a nice bridge from pure machine data to more full-featured data analysis.

That difference was shown by what I thought was the most enlightening customer slide, one about Tesco. For my primarily US readers, Tesco is a major grocery chain with divisions covering everything from the corner market to supermarkets. They are headquartered in England, are the major player in Europe, and are the second-largest retailer by profit after Walmart.

As described, Tesco began using Splunk to analyze network and website performance, focused on the purely machine data concerns for performance. As they saw the benefit of the product to more areas, they expanded to customer revenue, online shopping cart data and other higher level business functions for analysis and improvement.

Summary

Splunk is a robust and growing company focused on providing operational intelligence. Unfortunately, their messaging is lagging their business. They still focus on machine data as the core message because that was their technical and business focus in the last decade. I have no doubts that they’ll keep growing, but clearer articulation of their strategy, priorities and messages will help a wider market understand their benefits more quickly.

TDWI Webinar – Preparing Data for Analytics with Liaison Technologies

Tuesday’s TDWI Webinar was titled “Preparing Data for Analytics,” but that’s not what it was about. It was still interesting, just misnamed. The focus was, as would be expected with Liaison as the sponsor, on how managing data in the Cloud can enhance the ability of some companies to support BI.

It started with Philip Russom, an in-house TDWI analyst, talking a bit about preparing data without ever using the words extraction and transformation. The most interesting point he made was an aside in the main presentation, but one that should be brought to the fore: what’s SMB?

Most of us think of SMB as Small-to-Medium sized Businesses. His point was that it’s really Small-to-Medium sized Budgets. Many folks involved in enterprise software understand that you often don’t sell an enterprise license up front. A software vendor will sell to a department or a business division with a smaller budget that needs to get something done. Then the tactic is to expand within the enterprise and build it into a major account. Mr. Russom makes the great point that the tactics for selling into the smaller groups in an enterprise are very similar to those used to sell to smaller businesses, so maybe there are opportunities being left on the table by smaller vendors.

His other key point needs a webinar of its own. He mentioned that companies looking for Cloud analytics should “make sure Cloud solutions are optimized for data and not for applications.” It’s the data that’s important, how you get it and how you prepare it. That’s what has to be supported first, then applications can access the data. Sadly, he said that and moved on without any real details on what that preparation means. I’d like to see more details.

The main speaker was Alice Westerfield, Sr. VP of Enterprise Sales at Liaison Technologies. Her main point followed Mr. Russom’s lead-in by arguing that a good analytics platform requires moving from an application-centric approach to a data-centric one. No surprise, Liaison has one handy. Most importantly, it’s a Cloud approach, one they were offering before Cloud became the buzzword.

Alice was brief but focused on their history of supporting integration in the Cloud. The four main benefits she mentioned were:

  • Data Integration
  • Data Transformation
  • Data Management
  • Data Security

That makes sense, and we all know how much the last one, security, matters before people are willing to perform the first three in the Cloud. However, it’s the changing nature of the data game that means I want to focus on the first, data integration.

While Liaison talks about the benefits of leveraging their years of integration skills rather than reinventing them or taking new integrations and installing them in an on-premises solution, there’s another Cloud aspect that I think is critical. Most businesses use a mix of applications, and many are already in the Cloud. Add to that the mobile nature of today’s generation of BI solutions, which are provided in the Cloud. It makes sense for many SMBs to leverage that. Why take data from Cloud apps, move it on-premises and then move it back to the Cloud? Using a service such as Liaison’s simplifies and speeds the process of meshing data from inside and outside the firewall and then providing wide access to knowledge workers through the new BI interfaces.

For the foreseeable future, there will continue to be reasons for keeping data within the firewall, but for most data and most companies, a solution such as Liaison’s would seem to be an opportunity to quickly integrate data and share it as broadly as required.

Kalido: A solution in search of the message?

I had the fortune to see Kalido presentations twice in two days. First was the Qlik road show event on Thursday; second was the Boulder BI Brain Trust call on Friday.

Kalido provides a streamlined way to create and manage data warehouses. The key seems to be a strong data modeler linked to the engine, providing a more graphical way to define processes and tying that to the automatic creation of the physical data layers and to data warehouse management. According to their case studies, the result is a significant savings in time to deploy and manage warehouses. As they pointed out in the BBBT presentation, the deployment savings is a clear and compelling argument, but the longer-term savings in ongoing operational costs is one they haven’t yet successfully attacked.

That ties into the issue of their major message to the Qlik audience and on their web site: “No ETL!” As anyone who understands their technology knows, and as they pointed out in their BBBT presentation, ETL is one component of their solution. The presenter on Thursday tried to claim it’s not ETL but ELT, because they use a temporary data store to more quickly extract information from operational systems, but that’s not going to cut it. ETL is still performed, even if in a slightly different order. IT people will understand that and laugh at the claim, most BI business users won’t know what it means, and the rest won’t care, as it’s not a major concern for people trying to get information out of the warehouse.
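For readers who do care about the distinction, here’s a minimal sketch of the ELT ordering being described, using an in-memory SQLite database and invented table names rather than anything from Kalido’s product: the same extract, load and transform steps occur, just with the transform pushed after the load.

```python
# ELT sketch: extract rows, load them untouched into a staging store,
# then transform with SQL inside the target. (Names are invented.)
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging_orders (id INTEGER, amount_cents INTEGER)")

# Extract + Load: raw operational rows land in staging as-is, keeping
# the extract window against the source system short.
raw_rows = [(1, 1999), (2, 4550)]
conn.executemany("INSERT INTO staging_orders VALUES (?, ?)", raw_rows)

# Transform: the reshaping happens after the load, in the warehouse.
conn.execute("""
    CREATE TABLE orders AS
    SELECT id, amount_cents / 100.0 AS amount_dollars
    FROM staging_orders
""")
print(conn.execute("SELECT * FROM orders").fetchall())  # [(1, 19.99), (2, 45.5)]
```

All three letters are still there; only the order changed, which is exactly why the “No ETL!” claim rings hollow.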

Operational costs matter to both IT and the business line managers. As many IT centers internally “bill” divisions for costs, those savings will matter to both sides more than a specious ETL message.

More importantly, the ability to change your business model and have it rapidly reflected in the data warehouse is of strong value to decision makers. Eliminating 6-9 months of rework, in which a change is finally implemented only to find it already out of date, is a clear and compelling message for business decision makers. The ability to rapidly satisfy business users in changing markets while using fewer IT resources is valuable to the IT organization.

So why does the message seem to be missing a great market focus opportunity? One possible answer is found on their executive team page. Rather, it’s in what’s not found there. A company that wants to leave the startup phase and address a wider market would do well to give marketing the same importance as engineering and professional services. Products aren’t enough; you have to create messages that address what interests stakeholders. Kalido seems to have a very good product, but they aren’t yet able to create messages that address the wider market.