Tag Archives: bbbt

Teleran at BBBT: Great technology, again with the message…

The BBBT started off 2015 with yet another company with great technology and a far too simplistic strategy and message. It’s the old problem: Lots of folks come to the BBBT because their small companies are starting to get traction and they want wider exposure, but the management doesn’t really understand Moore’s Chasm, so they’re still pitching to their early adopters rather than the larger market.

Friday’s presenter was Kevin Courtney, VP Business Solutions, Teleran. Back in the day, I evaluated a small technology company for acquisition of their technology and inclusion into Mercury Interactive’s testing suite. It was an SQL inspector that let our products see the transactions going between clients and servers to help improve performance testing.

Teleran starts from the same basic technology but has come much further in recent years, layering great analytics on top to help companies understand database usage and optimize application and network performance. They’ve broken down the issues into three key areas of business concern:

  • Performance and value: How queries are being performed, in order to minimize dead data transfer and increase the value of existing computing infrastructure.
  • Risk and Compliance: Understanding who is doing what with data in order to minimize risk and prove regulatory and contract compliance.
  • Modernization, migration and re-platforming: Understanding existing loads, transactions and queries in order to better prepare for upgrading to new technologies – both hardware and software.

In support of these capabilities, Kevin mentioned that they have 8 software patents. While my understanding of patent, trademark and copyright laws suggests that software patents shouldn’t be legal, they are, and the patents do show innovation in the field. Hopefully.

Mr. Courtney also did a good job giving stories that supported each of these areas. I’ll quickly describe my two favorites (I know, three bullets, but that’s life).

One example showed that value is more than just a dollar value. He described a financial trading house using Teleran to analyze the different technology and data usage patterns between their top and bottom performing agents, then used that analysis to provide training to the bottom tranche (yes, I did have to use that word while discussing finance) in order to improve their performance.

The second example combined performance and modernization. He described a company where there were seven unsanctioned data marts pulling full data sets from operational systems. That had a severely negative impact on performance throughout their infrastructure. The understanding of those systems allowed for planning to consolidate, upgrade and modernize their business intelligence infrastructure.

So what’s the issue?

They have a great product suite, but what about strategy? The discussion, with additional information from Chris Doolittle, VP Marketing, via phone, made clear that they have a system that isn’t cheap, and they readily admit they have trouble proving their own value.

Take a look at the Teleran site. Download some case studies. What you see is lots and lots of discussion about the technology. However, even when they do discuss some of the great stories they told us, the business value is still buried in the text. They’re still selling to IT and not providing IT the clear information needed to convince the business users to write the checks.

What’s needed is the typical chasm move of turning things upside down. They need to overhaul their message. They need to boldly lead with the business value and discuss how it’s provided only after describing that value.

What’s also needed is something that will be even harder: changing the product in synchronization with the message. The demonstration showed a product with little thought put into the user interface. At one point, after going through four different tables, Kevin said, “if you take x, y and z, then you can see that…” Well, that needs to be clear in a business intelligence driven interface rather than having the pieces scattered around, requiring additional information or thought to figure out. It’s overcrowded, very tabular, and the dashboards aren’t really dashboards. They need to contract with or hire some UI experts to rethink their interface.

They do OEM Qlik, but there are two problems with that. It looks like they’re using a very old version and aren’t taking advantage of Qlik’s modern BI toolsets. Also, the window with the information has a completely Qlik title. It should read Teleran’s product powered by Qlik in order to keep context.

Summary

Teleran is another company with a great technology that needs to change in order to cross the chasm. Their advantage is that their space, performance analysis, is far less crowded than the database or BI end points. If they can clarify their products and messages, they can carve out a very nice chunk of the market.

Denodo at BBBT: Data Virtualization, an Important Niche

Data virtualization. What is it? A few companies have picked up the term and run with it, including last week’s BBBT presenter Denodo. The presentation team was Suresh Chandrasekaran, Sr. VP, North America, Paul Moxon, Sr. Director, Product Management & Solution Architecture, and Pablo Alvarez, Sales Engineer. Still, what I’ve not seen is a clear definition of the phrase. The Denodo team did a good job describing their successes and some features that help that, but they did avoid a clear definition.

Data Virtualization

The companies doing data virtualization are working to create a virtual data structure where the logical definitions link back to disparate live systems instead of overlaying a single aggregated database of information. It’s the concept of a federated data warehouse from the 1990s, extended past the warehouse and now more functional because of technology improvements.
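
To make the concept concrete, here’s a minimal sketch of the idea, not Denodo’s implementation, using Python and SQLite’s ATTACH so one logical query spans two physically separate databases; the file, table and column names are invented for the example.

```python
import sqlite3

# Two physically separate "source systems" (invented for this example).
crm = sqlite3.connect("crm.db")
crm.execute("CREATE TABLE IF NOT EXISTS customers (id INTEGER, name TEXT)")
crm.execute("INSERT INTO customers VALUES (1, 'Acme Corp')")
crm.commit()
crm.close()

sales = sqlite3.connect("orders.db")
sales.execute("CREATE TABLE IF NOT EXISTS orders (customer_id INTEGER, total REAL)")
sales.execute("INSERT INTO orders VALUES (1, 99.50)")
sales.commit()
sales.close()

# The "virtual" layer: one logical query spanning both live sources,
# with no aggregated copy of the data built in between.
virtual = sqlite3.connect(":memory:")
virtual.execute("ATTACH 'crm.db' AS crm")
virtual.execute("ATTACH 'orders.db' AS ord")
for row in virtual.execute("""
        SELECT c.name, SUM(o.total)
        FROM crm.customers c JOIN ord.orders o ON o.customer_id = c.id
        GROUP BY c.name"""):
    print(row)  # ('Acme Corp', 99.5)
```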

Data virtualization (and note that, sadly, I don’t create an acronym because DV is also data visualization and who needs the confusion. So more typing…) is sometimes thought of as a way to avoid data warehouses by people who hear about it at a high level, but as the Denodo team repeatedly pointed out, that’s not the case. Virtualization can simplify and speed some types of analysis, but the need for aggregated data stores isn’t going away.

The biggest problem with virtualization for everything is that operational systems can’t handle the performance hit of lots of queries. A second is that operational systems don’t typically track the historical information needed for business analysis. A third is that very static data in multiple systems, accessed frequently, can create an unnecessary load on today’s busier and busier networks; consolidating that information can simplify and speed access. Yet another is that change management becomes a major issue, with changes to one small system potentially rippling through many systems and reports. There are others, but none of them undermine the value that is virtualization.

As Pablo Alvarez discussed, virtualization and a warehouse can work well together to help companies blend data of different latencies, with virtualization bringing in dynamic data to mesh with historic and dimensional information to provide the big picture.

Denodo

Denodo seems to have a very good product for virtualization. However, as I keep pointing out when listening to the smaller companies, they haven’t yet meshed their high-level ideas about virtualization and their products into a clear message. The supposed marketecture slide presented by Suresh Chandrasekaran was very technical, not strategic. Where he really made a point was in discussing what makes a Denodo pitch successful.

Mr. Chandrasekaran stated that pure business intelligence (BI) sales are a weak pitch for data virtualization and that IT sees the value in a broader data need. That makes absolute sense, as the blend between BI and real-time is just starting and BI tends to look at longer-latency data. It’s the firms accessing many disparate systems, for all types of productivity and business analysis beyond BI, that want to get to those systems as easily as possible. That’s Denodo’s sweet spot.

While their high level message isn’t yet clarified or meshed with markets and products, their product marketing seems to be right on track. They’ve created a very nicely scaled product line.

Denodo Express is a free version of their platform. Paul Moxon stated that it’s fully functional, but it can’t be clustered, limits result set size and can’t access certain data sources. However, it’s a great way for prospects to look at the functionality of the product and to build a proof of concept. The other great idea is that Denodo gives Express users a fixed-time pricing offer for enterprise licensing. While not providing numbers, Suresh stated that the offer is working well as an incentive for the freeware not to become shelfware and for prospects to test and move down the sales funnel. To be blunt, I think that’s a great model.

One area they know is a weakness is in services, both professional services and support. That’s always an issue with a rapidly growing company and it’s good to see Denodo acknowledge that and talk about how they’re working to mitigate issues. The team said there are plans to expand their capital base next year, and I’d expect a chunk of that investment to go towards this area.

The final thing I’ll note specifically about Denodo’s presentation is their customer slides. That section had success stories presented by the customers, their own views. That was a strong way to show customer buy in but a weak way to show clear value. Each slide was very different, many were overly complex and most didn’t clearly show the value they achieved. It’s nice, but customer stories need to be better formalized.

Data Virtualization as a Market

As pointed out above, in the description of virtualization, it’s a very valuable tool. The market question is simple: Is that enough? There have been plenty of tools that eventually became part of a larger market or a feature in a larger product offering. What about data virtualization?

As the Denodo team seems to admit, data virtualization isn’t a market that can stand on its own. It must integrate with other data access, storage and provisioning systems to provide a whole to companies looking to better understand and manage their businesses. When there’s a new point solution, a tool, partnerships always work well early in the market. Denodo is doing a good job with partners to provide a robust solution to companies; but at some point bigger players don’t want to partner but to provide a complete solution.

That means data virtualization companies are going to need to spread into other areas or be acquired. Suresh Chandrasekaran thinks that data virtualization is now at the tipping point of acceptance. In my book, given how fast the software industry, in general, and data infrastructure markets, in particular, grow and evolve, that leaves a few years of very focused growth before the serious acquisitions happen – though I wouldn’t be surprised if it starts sooner. That means companies need to be looking both at near term details and long term changes to the industry.

When I asked about long term strategy, I got the typical startup answer: They’re focused on internal growth rather than acquisition (in either direction). That’s a good external message, because prospects who want a leading edge company want it clear that they’re buying from one, but I hope the internal conversations at the CxO level aren’t avoiding acquisition. That’s not a failure, just a different version of success.

Summary

Denodo is a strong technical company focused on data virtualization in the short run. They have a very nicely scaled model from Denodo Express to their full product. They seem to understand their sweet spot within IT organizations. Given that, any large organization looking to get better access to disparate sources of data should talk with Denodo as part of their evaluation process.

My only questions are about marketing messages and whether or not Denodo will be able to change from a technical sale to a higher level, clearer vision that will help them cross the chasm. If not, their product still isn’t going away; someone will acquire them. Regardless, Denodo seems to be a strong choice to look at to address data access and integration issues.

Data virtualization is an important niche; the questions that remain are how large the niche is and how long it will remain independent.

Revolution Analytics at BBBT: Vision and products for R need to mesh

Revolution Analytics presented to the BBBT last Friday. The company is focused on R, with a stated corporate vision of “R: The de-facto standard for enterprise predictive analytics.” Bill Jacobs, VP, Product Marketing, did most of the talking while Steve Belcher, Sales Engineer, gave a presentation.

For those of you unfamiliar with R as anything other than the letter smack between Q and S, R is an open source programming language for statistics and analytics. The Wikipedia article on R points out it’s a combination of Scheme and S. As someone who programmed in Scheme many years ago, I didn’t see the resemblance in the code fragments shown, but I did smile at the evolution. At the same time, the first thing I said when I saw Revolution’s interactive development environment (IDE) was that it reminded me of EMACS, only slightly more advanced after thirty years. The same wiki page also says that R is a GNU project, so now I know why.

Bill Jacobs was yet another vendor presenter who mentioned that his company realized the growth of the Internet of Things (IoT) means a data explosion that will leave what is currently misnamed as big data in the dust as far as data volumes go. He says Revolution wants to ensure that companies are able to effectively analyze IoT and other information, and that his company’s R is the way to do so.

Revolution Analytics is following in the footsteps of many companies which have commercialized freeware over the years, including Sun with Unix and Red Hat with Linux. Open source software has some advantages, but corporate IT and business users require services including support, maintenance, training and more. Companies which can address those needs can build a strong business and Revolution is trying to do so with R.

GUI As Indicative Of Other Issues

I mentioned the GUI earlier. It is very simple and still aimed at very technical users, people doing heavy programming who understand detailed statistics. I asked why and was told that they felt that was their audience. However, Bill had earlier talked about analytics moving forward from the data priests to business analysts and end users. That’s a dichotomy. The expressed movement is a reason for their vision and mission, but their product doesn’t seem to support that mission.

Even worse was the response when I pointed out that I’d worked on the Apple Macintosh before and after MPW was released and had worked at Gupta when it had the first 4GL language on the Windows platform. I received a long-winded answer as to why going to a better and easier-to-use GUI wasn’t in the plans. Then Mr. Jacobs mentioned something to the effect of “You mentioned companies earlier and they don’t exist anymore.” Well, let’s forget for a minute that Gupta began a market, that others such as Powersoft did well for years before Microsoft came out with its Visual products to take control of the market, and that there were many good years for other firms whose products are still around. Let’s focus on wondering when Apple ceased to exist.

It’s one thing to talk about a bigger market message in the higher points of a business presentation. It’s another, very different, thing to ensure that your vision runs through the entire company and product offering.

Along with the vision mentioned above, Revolution Analytics presents a corporate mission to “Drive enterprise adoption of R by providing enhanced R products tailored to meet enterprise challenges.” Enterprise adoption will be hindered until the products show they can work not just for specialist programmers but for a wider enterprise audience.

Part of the problem seems to be shown in the graphic below.

Revolution Analytics tech view of today

Revolution deserves credit for accurately representing the current BI space in a snapshot. The problem is that it is a snapshot of today, and there wasn’t an indication that the company understands how rapidly things change. Five to ten years ago, the middle column was the left column. Even today there’s a very technical need for the people who link the data to those products in order to begin analysis. In the same way, much of what is in the right column used to be in the middle. In only a few years, the left column will be in the middle and the middle will be on the right.

Software evolves rapidly, far more rapidly than physical manufacturing industries. Again, in order to address their enterprise mission, Revolution Analytics’ management is going to have to address what’s needed to move towards the right-hand columns that enterprise adoption implies.

Enterprise Scalability: A Good Start

One thing they’ve done very well is build out a scaled product suite to attract a wider audience: different sized businesses, individual departments and others.

Revolution Analytics product suite

They seem to have done a good job of providing a layered approach from free use of open source to enterprise weight support. Any interested person should talk with them about the full details.

Summary

R is a very useful analytical tool and Revolution Analytics is working hard to give businesses the ability to use R in ways that leverage the technology. They support both groups who want pure, free open source and others who want true enterprise support, in the way other open source companies have succeeded in previous decades.

Their tool does seem powerful, but it is still clearly and admittedly targeted at the very technical user, the data priests.

Revolution Analytics seems to have the start of a good corporate mission and I think they know where they want to end up. The problem is that they haven’t yet created a strategy that will get them to meet their vision and mission.

If you are interested in using R to perform complex analysis, you need to talk to Revolution Analytics. They are strong in the present. Just be aware that you will have to help nudge them into the future.

WhereScape at BBBT: Another Intriguing Product Without a Clear Message

Last Friday’s BBBT presentation was by Michael Whitehead, CEO, WhereScape. The company seems to have a very interesting and useful product, but there’s a huge communications gap that needs to be addressed.

What They Do

One marketing issue to start: I got most of this section from my own experience and WhereScape’s web site, not from Michael’s presentation. When someone begins a presentation to a group of analysts by proudly announcing it is “guaranteed there’s no corporate marketing in the presentation at all,” there’s a disconnect, and it shows.

WhereScape has two products, Red and 3D, to help build and maintain data structures. The message is focused on data warehouses, but I’ll discuss that more in the next section. One issue was that their demonstration didn’t work as there seemed to be a problem connecting between their tablet and the BBBT display system, so much of what I’m saying is theory rather than anything demonstrated.

Red is their tool to build data warehouses. Other tools exist and have been around for decades, Informatica being just one competing firm.

3D is where the differentiation comes in. Everyone in IT understands the nightmare that is upgrading major software installations such as ERP, CRM and EDW systems. Even migrating from one version to the next of a single vendor’s product can involve months of planning, testing and building, followed by more months of parallel runs to be safe. A better way of analyzing and modifying data structures, one that can compress that time frame, can have a large positive impact on a corporation. That’s what WhereScape is attempting.

What They Say

However, their message is all “Automation! Automation! Automation!” while the short part of the demo that worked showed some automated analysis but a lot of clicks necessary to accomplish the task. From what I saw, it will definitely speed up the tasks, if it works as advertised, with clear time and money savings; but it’s not as automated as implied and I think a better message is needed.

In addition, their message is focused on data warehouses while Michael said “We’re in the automation business not the data warehouse business,” which really doesn’t say anything.

Michael did talk for a bit about the bigger data picture that includes data warehouses as part of the full solution, but again there was no clear message. While saying that he doesn’t like the term data lake, he’s another who can’t admit that it’s just the ODS. There was also a discussion of the logical data warehouse, also not something new.

One critical and important thing Mr. Whitehead mentioned was something I’ve heard from a few people recently: the point that Hadoop and other “unstructured databases” aren’t really unstructured. They support late binding, the ability to skip defining a structure a priori and instead get the data, then understand and define a usable structure for analysis.
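
A minimal sketch of late binding, mine rather than anything WhereScape showed: the raw records are stored exactly as they arrived, and a schema (the field list below is invented) is applied only at read time, so the structure can change without reloading the data.

```python
# Raw records stored exactly as they arrived, with no schema applied at load.
raw_lines = [
    "2014-11-07 09:15:02|web01|login|ok",
    "2014-11-07 09:15:09|web02|search|ok",
    "2014-11-07 09:16:44|web01|login|failed",
]

# Late binding: the structure is imposed at read time, not at load time.
schema = ["timestamp", "host", "action", "status"]

def bind(line, fields):
    """Turn one raw line into a record using whatever schema we hold today."""
    return dict(zip(fields, line.split("|")))

failures = [r for r in (bind(l, schema) for l in raw_lines)
            if r["status"] == "failed"]
print(failures)  # [{'timestamp': '2014-11-07 09:16:44', 'host': 'web01', ...}]
```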

What They Need to Say

This is the tough one and not something I’m going to solve in a short column. The company is targeting a sweet spot. Data access has exploded, and that includes EDWs not going away, the misnamed concept of big data and much more. Many products have been created to build databases to manage that data, but the business intelligence industry is still where packaged back-end systems were in the 1990s. Building is easier than maintaining and upgrading. A firm that can help IT manage those tasks in an efficient, affordable and accurate way will do well.

WhereScape seems to be aimed at that. However, their existing two-fold focus on automation and data warehousing is wrong. First, it doesn’t seem all that automated yet and, even if it were, automation is the tool rather than the benefit. They need to focus on the ROI that the automation presents to IT. Second, from what was discussed, the application has wider applicability than just EDWs. It can address data management issues for a wider range of business intelligence sources, and the message needs to include that.

Summary

Though the presentation was very disjointed, WhereScape seems to have focused on a clearly relevant and necessary niche in the market: How to better maintain and upgrade the major data sources needed to gain business understanding.

Right now, while there is a marketing staff at the company, WhereScape’s message seems to be coming solely from the co-founder and CEO. While that was OK in the very early days, they have some good customer stories, having led with Tesco’s success in this presentation, and it’s time to take a stronger and clearer core message to the market.

Where the issue seems to lie is the problem I’ve repeatedly seen with messaging. The speed of the industry has increased and business intelligence is, on the whole, crossing Geoffrey Moore’s chasm. That means even younger firms need to transition from a startup’s technically focused message to a broader one much more rapidly than vendors needed to in the past.

While WhereScape has what seem to be the strong underpinnings of a successful product, they need to do some serious brainstorming in order to clarify and incorporate a business-oriented message throughout their communications channels – including in presentations by founders.

Qlik Sense at the BBBT: Setting Up for the Future

Qlik was at the BBBT last week to talk about Qlik Sense. The presenters were Josh Good, Director of Product Marketing, and Donald Farmer, VP of Innovation and Design. It was a good presentation and Qlik Sense seems like the start of a good product, but let me start by discussing a tangent.

A startup’s voice: A marketing tangent

Startups usually have a single voice: the founder, CTO or somebody else who is the sole owner of the vision. Sometimes it’s somebody who is put forward as the visionary, correctly or incorrectly. It takes a level of maturity for a company to clarify a core message to the point where it’s replicable by a wider variety of people and the original spokespeople can let go. While the modern BI industry is still fairly young and every analyst group talks about the untapped market, Qlik is one of the biggest players in our nascent business.

Donald Farmer is a great presenter, a smart man, and has been, until recently, the sole Qlik voice I heard in every presentation. While I don’t always agree with him, he’s a pleasure to hear. Yet I continually thought “why him, always?” There might be somebody else briefly doing a demo, but he was THE voice of Qlik.

It’s not only because of my product marketing experience that I was pleased to hear from Josh. He wasn’t the demo dolly, but led the presentation with Donald chiming in. They worked well together. It’s clear that both of the startup issues I mentioned are being addressed by a maturing Qlik marketing organization that is now using multiple voices well.

Qlik Sense

I’ve blogged about other companies recently, talking about the focus on UI. Thankfully, it’s spreading. Companies that focused, in the early days, on the business analyst are realizing they need to better address the business knowledge worker. Qlik Sense has a nice, clean interface. It’s nowhere near the overcrowded confusion of most products from a few years back. For those who want to see it, the client software is freely downloadable so you can try it out.

The one issue I have is, again, the same one I’ve mentioned with many other vendors: ETL. Josh was another person who started the demo by importing a spreadsheet. Yes, I know there’s a lot of data in them and all products need to access spreadsheets, but it’s one way of avoiding the ETL issue. Other than very basic, departmental data, more complex decision making always involves other sources. It’s the heterogeneity of data that is today’s big issue. However, that’s a weak spot hidden by just about everyone.

What was nice was the software’s intelligence in building an initial data relationship diagram based on field name relationships. It’s a start, and if they keep at it the feature can grow into something that more easily shows the business user the links between different pieces of information.
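
Conceptually the inference is simple; a naive sketch of the idea (mine, not Qlik’s algorithm, with invented table and field names) proposes a relationship wherever two tables share a field name.

```python
from itertools import combinations

# Invented example tables and their field names.
tables = {
    "Customers": {"CustomerID", "Name", "Region"},
    "Orders":    {"OrderID", "CustomerID", "Amount"},
    "Regions":   {"Region", "Manager"},
}

# Propose a join candidate wherever two tables share a field name.
for (t1, f1), (t2, f2) in combinations(tables.items(), 2):
    for field in sorted(f1 & f2):
        print(f"{t1}.{field} <-> {t2}.{field}")
# Customers.CustomerID <-> Orders.CustomerID
# Customers.Region <-> Regions.Region
```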

A number of vendors have recently begun to have their software look at data and propose initial visualizations based on data type. It’s an easy way for users to get going. Qlik Sense doesn’t do that, and the response to the question was marketing fluff, but its display for choosing chart types is better than most. Rather than a drop-down list to select charts, it displays the types with mini-images. That will do for now.

Mobile done well

One fantastic part of the demo was how well they’ve integrated mobile into the system. They were going to show it anyway, but before Josh could get to it there was a problem with his PC. He quickly pulled up his iPad and, using the same account, continued on his way with the same information, well formatted for the new display. A key point is that Qlik isn’t just using mobile devices for display; Josh was creating visualizations on the device.

That other data…

I’ve already mentioned heterogeneity. A number of younger companies, focused on the Cloud, have created clear links to Salesforce and other cloud data sources to easily let SMBs and departments access those sources. Qlik does not have that capability, nor easy access to major ERP and CRM systems. That will still take strong interactions with IT to create links and access for the users.

That matters to me, for one example, because of the repeated demo examples from the sales arena. Yes, sales managers remain heavy users of spreadsheets, but SFA systems have made strong inroads and the ability to combine those sources quickly for sales management is critical.

Data Governance: Thinking ahead

One area where Qlik seems to excel is in thinking about the issues of data governance. Even in this early version of Qlik Sense they’ve included some powerful ways of controlling access, from both administration and business user standpoints. I’ve seen other vendors talk about it, with only some willing to show it if questioned. Josh and Donald brought it up as part of their basic presentation and showed a nice interface.

Just as the growth of PCs gave individuals power while hurting data governance, BI needs to get hold of those issues and help the end user and IT work together to manage corporate data according to business and legislative policies. Qlik’s focus on that is an important differentiator.

Summary

Qlik Sense is a new product. It has very good visualization, which should be expected from Qlik, and has moved forward to an improved UI for ease of use. While they still have issues of concern with data access, their data governance implementation seems to be ahead of the curve and is well thought out. It’s an early generation product, so it doesn’t bother me that it has some holes. The critical thing is to look at the products in the perspective of your timeframe of needs and see if it’s right for you.

Just as importantly, from my marketing perspective, is the maturation of the marketing message and team. I’m hearing multiple voices speaking the same message. On the product and corporate fronts, Qlik is moving ahead in a good direction.

SiSense at the BBBT: High Performance BI at Low Cost?

The latest presentation at the BBBT was by Amit Bendov, CEO, Sisense. First, a marketing warning: If you’re going to their web site, be prepared. Maybe it’s only for some weird Halloween thing, but the yellow and black background is one of the ugliest things I’ve seen for a professional company. However, let’s look under the covers, because it gets better.

The company was founded in 2004 and Amit says the first sales came in 2010. There’s a good reason for that delay. They are yet another young company talking about being a full stack BI provider: more than a visualization tool, also supposedly providing ETL, data storage and the full flow of your information supply chain from source systems to display. That technology took a while to develop.

Technology: Better integration of memory and disk

The heart of their system is a patent-pending technology that tightly integrates CPU cache, RAM and disk to better leverage all storage methods for higher performance. The opportunities that theory provides are enough that they’ve received $50 million (USD) in venture funding, $30 million of it in their latest round earlier this year.

As they are a startup, it’s no surprise that the case studies given were for SMBs or departments within enterprises. That’s the normal pattern, where a smaller group takes advantage of its flexibility to try new products to solve focused problems. As their customer list includes companies such as eBay, Wix, ESPN and Merck, companies with lots of data, those early entrants increase the potential if Sisense continues to perform.

Another key technology component is their columnar database. They created a proprietary one to be able to support their management technology. That’s completely understandable as their database isn’t purely on disk or memory, but in a combined mix that needs special database management.
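
For readers who haven’t run into the term, columnar storage keeps each column’s values together rather than keeping each row together, which favors the scans and aggregations BI workloads depend on. A toy sketch of the layout difference, not Sisense’s engine:

```python
# Row-oriented layout: each record stored together (good for transactions).
rows = [
    {"region": "East", "sales": 100},
    {"region": "West", "sales": 250},
    {"region": "East", "sales": 175},
]

# Column-oriented layout: each column stored contiguously (good for scans
# and aggregates, and friendlier to CPU caches, since a query touches only
# the dense arrays for the columns it needs).
columns = {
    "region": ["East", "West", "East"],
    "sales":  [100, 250, 175],
}

# Summing one column touches only that column's storage.
print(sum(columns["sales"]))  # 525
```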

The final key to their technology is that they worked to ensure the software runs on commodity chips of the x86 heritage. That means it runs on normal, affordable, off-the-shelf servers, not on high-priced appliances.

Sisense hardware price comparison

The combination of the speed and affordability of the technology is justification for the rounds of funding they’ve received.

Really full stack?

One fuzziness that I’ve mentioned with other full stack vendors is the ETL side of the process. The growth of Cloud companies such as Salesforce, and the accessibility of their APIs, means that you can get a lot of information out of systems aimed at SMBs. However, true enterprise ETL means accessing a very wide variety of systems with far less easy or open APIs. When Mr. Bendov talked about multiple systems, it seemed, from presentation and demo, that he meant multiple instances of simple databases or open APIs, not a breadth of source types. There wasn’t a lot of choice in the connection section of his application.

That’s not a problem for companies at Sisense’s state of maturity, as long as there’s a business plan to expand to more enterprise sources. They need to focus on proving the technology in the short term and having more heterogeneous access in their tool bag for the future.

Another issue is the question of what, exactly, their database is. Amit Bendov made a brief comment about not needing a data warehouse, but as I and others quickly brought up, there are two problems with that statement. First, they would seem to be a data warehouse: they’re extracting information from source systems, transforming that information, even if not into the old star-schema structures, and providing the aggregate information for analysis. Isn’t that a high-level description of a warehouse? Second, as they’re young and focused on SMBs and departments, as with other companies who serve up visualization, they might need to listen to customer demands and access corporate data warehouses as another source.

The old definition of a federated data warehouse seems to be evolving into today’s environment where sometimes an EDW is a source, other times a result and sometimes it’s made up of multiple accessible components such as Sisense and other databases. Younger companies who disparage EDWs need to be careful if they wish to address the enterprise market. The EDW is evolving, not dying off.

User interface and more

One of my first trips to Israel was, in part, when my boss and I had to bring a couple of UI specialists to show Mercury Interactive’s programmers why it might be nice to rethink application interfaces. It’s wonderful what twenty years have wrought. Amit Bendov says that Sisense has one UI specialist for every two programmers, and the user interface shows it. While I mentioned that they need broader ETL access, the simplicity of getting to sources is clear. While you will still need a business analyst to understand some column names, it’s a very easy-to-use interface.

The same is true in the visualization portions of their application. While it’s still a simpler tool, it has all the basics and is very clear to understand and use.

Paving the way for their spread into the enterprise, the Sisense team also supports single sign-on, basic data access control, both in global administration and in the user interface, and other things that will be needed to convince a larger corporation to spread the technology.

Summary

Sisense looks like a startup in a great position. Their technology is well thought out and seems to be performing very well in the early stages. Affordable, fast, business intelligence is something nobody will turn down.

The challenge is two-fold:

  • Do they have the technology plans to help them address larger enterprise issues?
  • Do they have the mindset to understand the importance not only in marketing, but in changing the marketing to a more business focus?

This is the same refrain you’ve heard from me before and which you’ll hear again. This is the Chasm challenge. Their technology has a great start, but their web site and presentation show they aren’t yet thinking bigger and we’ll have to see what the future holds both for the technology and the messaging.

Business intelligence is a very visible market and one growing quickly. While small companies need to focus on the early adopters, they must very rapidly learn how to address the enterprise, both in products and marketing.

High performance BI at a reasonable cost is a great sell, but Sisense isn’t yet ready for the full enterprise. Sisense has a great start, but life is fluid.

DataHero at the BBBT: A Startup Getting It Right

First, on a tangent not directly focused on the product: Thank you, Chris Neumann, CEO of DataHero. After hearing presenters from multiple companies consistently use the wrong words over the last few months, you used both premise and premises in the appropriate places. Thanks!

As you might gather, Wednesday’s presentation at the BBBT was by DataHero. A fairly young company, less than three years old, DataHero is focused on “Delivering a self-service Cloud BI solution that enables enterprise and SMB users to analyze and visualize their SaaS-based data without IT.”

Self-service BI is what almost all the players, both new and mature companies, are trying to provide these days. This just means they’re another player attempting to help business knowledge workers connect to data, analyze it and gather useful and actionable information without heavy intervention by business analysts and IT.

Cloud is also where everyone’s moving, since it has so many advantages in all areas of software. DataHero, as a small company, isn’t just in the Cloud; they’ve smartly decided to begin by focusing on public Cloud applications with accessible APIs.

While that initially simplifies things, the necessity to handle complexity still exists in that world. Mike Ferguson, another BBBT member analyst, pointed out that many of his clients have multiple, customized Salesforce.com instances, bringing the upgrade issues seen in on-premises systems into the Cloud world. Chris acknowledged that and understands the need to grow to handle the issue, but knows that at DataHero’s current size there’s enough of a market for an initially more focused solution.

A strategic issue comes up with the basic nature of the Cloud. Mr. Neumann mentioned Cloud being opposed to centralized data, but that’s not quite so. Depending on how Cloud systems are set up, they can help or hinder centralization of data. However, right now he is accurate in that most of the growth of Cloud is departmental in nature. It’s also further blurring the always fuzzy line between enterprise and SMB markets by providing applications that both groups can leverage.

Another area that shows thought in their growth strategy is entry into new markets. Chris is clear that they dip their toes into an arena, check reactions, and if those are positive they try to partner with as many companies in the space as possible to maintain neutrality. That means they don’t get locked into the first vendor the first client wants to work with, regardless of market control, leaving flexibility for customers. Their partner page, though young, clearly shows that strategy in effect. That’s a good move and I wish more vendors would think that way.

Another key growth issue is data cleansing. Right now, DataHero does none, expecting that the source system provides that capability. However, as clients use more and more source systems, there’s a cleansing need to clarify data clashes from different systems. That’s something the team at DataHero says they’re aware of while, again, that’s future growth (no time frames, as per legal sanity…).

The demo was very interesting. The other founder, Jeff Zabel, has a strong history in designing interfaces for software in vehicles, where usability really matters. That can be seen in a very clear and simple interface. It is easy to use. However, as many other companies have pointed out, 80% of business data has a location component, and many competing vendors are far ahead of DataHero in the area of geospatial information. That’s a key area they’ll have to improve.

Summary

DataHero is a young company with a young product. The key is that they aren’t just looking at their cool product and customizing solely based on first sales. The BI tool is clearly fully fledged for the market segment they’ve chosen for initial release, and they have thought through their growth strategy in far more detail than I’ve seen from other vendors who have presented at the BBBT.

If they execute their vision, and I see no reason why they wouldn’t, the folks at DataHero have a bright future.

Splunk at BBBT: Messages Need to Evolve Too

Our presenters last Friday at the BBBT were Brett Sheppard and Manish Jiandani from Splunk. The company was founded on understanding machine data and the presentation was full of that phrase and focus. However, machine data has a specific meaning and that’s not what Splunk does today. They speak about operational intelligence but the message needs to bubble up and take over.

Splunk has been public since 2012 and has over 1200 employees, something not many people realize. They were founded in 2004 to address the growing amount of machine data and the main goal the presenters showed is to “Make machine data accessible, usable and valuable to everyone.”

However, their presentation focused on Splunk’s ability to access IVR (interactive voice response) and Twitter transcripts, and that’s not machine data. When questioned, they pointed out that they don’t do semantic analysis but focus on the timestamp and other machine generated data to understand operational flow. Still, while you might stretch and call that machine data, they also displayed some very simple analytics on the occurrence of keywords in text, and that’s not it.

It’s clear that Splunk has successfully moved past pure machine data into a more robust operational intelligence solution. However, being techies from the Bay Area, it seems they still have their focus on the technology and its origins. They’re now pulling information from sources other than just machines, but are primarily analyzing the context of that information. As Suzanne Hoffman (@revenuemaven), another BBBT member analyst, pointed out during the presentation, they’re focused on the metadata associated with operational data and how to use that metadata to better understand operational processes.

Their demo was typical, nothing great but all the pieces there. The visualizations are simple and clear while they claim to be accessible to BI vendors for better analytics. However, note that they have a proprietary database and provide access through ODBC and an API. Mileage may vary.

There was also a confusing message in the claim that they’re not optimized for structured data. Machine data is structured. While it often doesn’t have clear field boundaries, there’s a clear structure, and simple parsing lets you know what the fields and data are in the stream. What they really mean is that it’s not optimal for RDBMS data. They suggest that you integrate Splunk and relational data downstream via a BI tool. That makes sense, but again they need to clarify and expose that information in a better way.
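
To illustrate why machine data counts as structured, here’s a sketch, mine with an invented syslog-style line rather than anything Splunk showed, pulling the timestamp and fields out of a raw event with one regular expression.

```python
import re

# An invented log line in a typical machine-data shape: no field
# delimiters, but a completely predictable structure.
line = "Nov  7 09:15:02 web01 sshd[2112]: Failed password for admin from 10.0.0.5"

pattern = re.compile(
    r"(?P<timestamp>\w{3}\s+\d+ \d\d:\d\d:\d\d) "
    r"(?P<host>\S+) (?P<process>\w+)\[(?P<pid>\d+)\]: (?P<message>.*)"
)

event = pattern.match(line).groupdict()
print(event["timestamp"], event["host"], event["process"])
# Nov  7 09:15:02 web01 sshd
```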

And then there’s the messaging nit. While business is my main focus, technology presented with the incorrect words jars an educated audience. Splunk is not the first company, nor will it, sadly, be the last, to have people who are confused about the difference between “premise” and “premises.” However, usually it’s only one person in a presentation. Here the slides and both presenters showed a corporate confusion, which leads me to the premise that they’re not aware of how to properly present the difference between Cloud and on-premises solutions.

Hunk: On the Hadoop Bandwagon

Another messaging issue was the repeated mention of Hunk without an explanation; only later in the presentation did they focus on it. Hunk is their product that puts the Splunk Enterprise technology on Hadoop. Let me be clear: it’s not just accessing Hadoop information for analysis but moving the storage from their proprietary system to Hadoop.

This is a smart move and helps address those customers who are heavily invested in Hadoop. At least at the presentation level, they have a strong message about having the same functionality as their core product, just residing on a different technology.

Note that this is not just helping the customer; it helps Splunk scale their own database to reach a wider range of customers. It’s a smart business move.

Security, Call Centers and Changing the Focus

The focus of their business message and a large group of customer slides is, no surprise, on network security and call center performance. The ability to look at the large amount of data and provide analysis of security anomalies means that Splunk is in the Gartner Magic Quadrant for SIEM (Security Information and Event Management).

In addition, IVR was mentioned earlier. That, combined with other call center data, allows Splunk to provide information that helps companies better understand and improve call center effectiveness. It’s a nice bridge from pure machine data to more full-featured data analysis.

That difference was shown by what I thought was the most enlightening customer slide, one about Tesco. For my primarily US readers, Tesco is a major grocery chain, with divisions focused on everything from the corner market to supermarkets. They are headquartered in England, are the major player in Europe and the second largest retailer by profit after Walmart.

As described, Tesco began using Splunk to analyze network and website performance, focused on the purely machine data concerns for performance. As they saw the benefit of the product to more areas, they expanded to customer revenue, online shopping cart data and other higher level business functions for analysis and improvement.

Summary

Splunk is a robust and growing company focused on providing operational intelligence. Unfortunately, their messaging is lagging their business. They still focus on machine data as the core message because that was their technical and business focus in the last decade. I have no doubts that they’ll keep growing, but a better clarification of their strategy, priorities and messages will help a wider market more quickly understand their benefits.

Datawatch at BBBT: Another contender and another question of message

Yesterday’s presentation to the BBBT was by Datawatch personnel Ben Plummer, CMO, and Jon Pilkington, VP Products. As they readily admit, they’re a company with a long history about which most people in the industry have never heard. They were founded in the 1980s and went public in the 1990s. Their focus is data visualization, but much of their business has been reseller and OEM agreements with companies including SAP, IBM and Tibco.

The core of their past success was basic presentation of flat file information through their Monarch product. It was only with the acquisition and initial integration of Panopticon in 2013, providing access to far more unstructured data, that they rebranded around data visualization and began to push strongly into the BI space.

The demo was very standard. Everyone wants to show their design interface and how easy it is to build dashboards, and their demonstration was in the middle of the pack. The issue I had was the messaging. It’s no surprise that everyone claiming to be a visualization company needs to show visualization, but if you’re not one of the very flashy companies, your message about building your visualization should be different.

Datawatch’s strengths seem to be three-fold:

  • Access a very wide variety of data sources.
  • Access to data in motion.
  • Full service from data access to presentation.

While Ben’s presentation talked about the importance of the Internet of Things and that real-time data is transactional, Jon’s presentation didn’t support those points. Datawatch is another company working to integrate structured and unstructured data, and they seem to have a good focus on real-time. Those need to be messages throughout their marketing, and that means in the demo.

Back from that tangent to the main line. The third point is a major key. Major ETL and data warehouse vendors aren’t going away, but for basic BI it adds cost and time to have to evaluate both an ETL tool and a data visualization tool, which may not work together as well as the demoware indicates (a surprise, I know…). The companies that can provide the full data supply chain from source to visualization can much more quickly and affordably add value for the business managers wanting better BI. I know it’s a fine line to message that while still working with vendors who overlap somewhat, but that’s why coopetition was coined.

They seem to have a good vision but they haven’t worked to create a consistent and differentiated message. That could be because of resources, and hopefully that will change. In February of this year Datawatch issued a common stock offering that netted them more cash. Hopefully some of that will be spent on creating strong and consistent marketing. That also includes such simple things as making press releases visible from the PR link as HTML, not just PDFs.

Summary

I know you’re getting tired of hearing the following refrain, but here it is again: I’ve heard this message before. The market is getting crowded with companies trying to support modern BI that blends structured and unstructured data. Technologists love to tweak products and think that minor, or even major, technical differences that aren’t visibly relevant to the market should sell the product all by themselves. Just throw some key market points on top and claim you have no competitors because your technology is so cool.

BI and big data are cool right now and there are a large number of firms attempting to fill a need. Datawatch seems to have the foundations for a good, integrated platform from heterogeneous data access to visual presentation of actionable information. That message needs to quickly become stronger and clearer. This is a race. Being in shape isn’t enough, you have to have the right strategy and tactics to win the race. Datawatch has a chance, will they stumble or end up on the podium?

NuoDB at the BBBT: Another One Bringing SQL to the Cloud

Today’s presentation in front of the BBBT was by NuoDB’s CTO, Seth Proctor. NuoDB is a small company with big investments. What makes them so interesting? It’s the same thing as in many of the other platform presenters at the BBBT. How do we get real databases in the Cloud?

Hadoop is an interesting experiment and has clearly brought value to the understanding of massive amounts of unstructured data. The main value, though, remains that it’s cheap. The lack of SQL means it’s OK for point solutions that don’t stress its performance limitations. Bringing enterprise database support to the Cloud is something else.

The main limitation is that Hadoop and other unstructured databases aren’t able to handle transactional systems while those still remain the major driver in operating businesses.

NuoDB has redesigned the database from the ground up to be able to run distributed across the internet. They’ve created a peer-to-peer structure of processes, with separate processes to manage the database and SQL front end transaction issues.

Seth pointed out that they “have done nothing new, just things we know put together in a new way.” He also pointed out they have patents. My gripe about patents for software is an issue for another day, but that dichotomous pairing points to one reason (Apple’s patent on a rounded rectangle is another example of the broken patent system, but off the soap box and onwards…).

It’s clear that old-line RDBMS products were designed for major, on-premises servers. The need for a distributed system is clear, and NuoDB is at the forefront of creating one. One intriguing potential strength, which there wasn’t time to discuss in the presentation, is a statement about the object-oriented structure needed for truly distributed applications.

Mr. Proctor stated that the database schema lives in object definitions, not hard coded into the database, and added that this provides more flexibility on the fly. What it also could mean is that the schema isn’t restricted to purely relational structures, and that future versions of their database could support columnar and even unstructured data. For now, however, the basic ability to change even a standard row-based relational database on the fly, without major impacts on performance or existing applications, is a strong benefit.
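
Mr. Proctor didn’t show code, but the idea of a schema held as metadata objects rather than baked into the storage can be sketched roughly like this (purely illustrative, with invented names, not NuoDB’s design):

```python
# Schema held as a metadata object, separate from the stored rows.
schema = {"customers": ["id", "name"]}
store = {"customers": [{"id": 1, "name": "Acme Corp"}]}

# "Altering" the table is just a metadata change; existing rows are left
# untouched and simply report a default for the new column when read.
schema["customers"].append("region")

def read(table):
    """Project stored rows through whatever the schema object says today."""
    return [{col: row.get(col) for col in schema[table]} for row in store[table]]

print(read("customers"))  # [{'id': 1, 'name': 'Acme Corp', 'region': None}]
```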

As the company is young and focused on the distributed aspects of performance, it was also admitted that their system isn’t yet one for big data, even structured data. They’re not ready for terabytes, not to mention petabytes, of data.

The Business

That’s the techie side, but what about business?

The company is focused on providing support for distributed operational systems. As such, Seth made clear they haven’t looked at implementations supporting both operational and analytical systems. That means BI is not a focus and so the product might not be the right system for providing high level business insight.

In addition, while I asked about markets, I mainly got an answer about Web sites. They seem to think the major market isn’t Global 1000 businesses looking to link distributed operational systems, but that Web commerce sites are their sweet spot. One example referred to a few times was transactional systems for businesses selling across a country or around the world. If that’s the focus, it needs to be made more explicit on their web site, which really doesn’t discuss markets in the least.

It’s also an entry into the larger financial markets space. Finance and medical have always been two key verticals for new database technologies due to their volumes of information. That also means NuoDB needs to prioritize the admitted lack of large database support or they’ll hit walls above the SMB market.

The one business thing that bothers me is their pricing model. It’s based on the number of hosts. As the product is based on processes, there’s no set number of processes per host. In addition, they mentioned shared hosting, places such as AWS, where hosts may be shared by multiple NuoDB customers or where load balancing might take your processes and have them on one host one day and multiple hosts the next.

Host-based pricing seems to be a remnant of the on-premises database systems that Cloud vendors claim to be leaving behind. In a distributed, internet-based setup, who cares how big the host is, where the host is, or anything else about the host? The work the customer cares about is done by the processes, the objects containing the knowledge and expertise of NuoDB, not the servers owned by the hosting firm. I would expect Cloud companies to move from processors to processes.

Summary

NuoDB is a company focused on reinventing the SQL database for the Cloud. They have significant investment from the VC and business markets. However, it would be foolish to think that Oracle, IBM and other existing mainstream RDBMS vendors aren’t working on the same thing. What NuoDB described to the BBBT used most of the right words from the technology front and they’re ramping up their development based on the investments, but it’s too early to say if they understand their own products and markets enough to build a presence for the long term.

They have what looks like very interesting technology but, as I keep repeating in review after review, we know that’s not enough.