Teradata Aster at the BBBT. Is a technology message sufficient?

Last Friday’s visitors to the BBBT were from Teradata Aster. As you’ve noticed, I tend to focus on the business aspects of BI. Because of that, this blog entry will be a bit shorter than usual.

That’s because the Teradata Aster folks reminded me strongly of my old days before I moved to the dark side: They were very technical. The presenters were Chris Twogood, VP, Product and Services Marketing, and Dan Graham, Technical Marketing.

Chris began with a short presentation about Aster. The closest it got to marketing was pointing to the real problem of the proliferation of analytic tools and noting that, as with all platform products, Aster is an attempt to better integrate a heterogeneous marketplace.

As with others who have presented to the BBBT, Chris Twogood also pointed out that R and other open source solutions aren't any more sufficient for a full BI solution managing big data and analytics than are pure RDBMS solutions, so a platform has to work with both the old and the new.

The presentation was then handed over to Dan Graham, that rare combination of a very technical person who can speak clearly to a mixed-level audience. His first point was a continuation of Chris', speaking to the need to integrate SQL and MapReduce technologies. In support of that, he showed a SQL statement he said could be managed by business analysts, not the magical data scientist. There will have to be some training for business analysts, but that's always the case in a fast moving industry such as ours.
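To make the point concrete, here's a minimal sketch of the gap Dan was pointing at. This is not Aster's SQL-MapReduce syntax (I'm not reproducing his statement here); the table, data and names are invented for illustration. It simply shows the same question answered once in hand-rolled map/reduce phases and once as a single declarative SQL statement of the kind a business analyst could own.

```python
# Toy contrast: "revenue per region" computed two ways.
# Illustrative only -- this is not Aster's SQL-MapReduce syntax.
import sqlite3

sales = [("east", 100), ("west", 250), ("east", 75), ("west", 25)]

# 1) MapReduce style: the author must think in map/shuffle/reduce phases.
def map_phase(records):
    for region, amount in records:
        yield region, amount                # emit key/value pairs

def reduce_phase(pairs):
    totals = {}
    for region, amount in pairs:            # "shuffle" and aggregate by key
        totals[region] = totals.get(region, 0) + amount
    return totals

print(reduce_phase(map_phase(sales)))       # {'east': 175, 'west': 275}

# 2) SQL style: one declarative statement, no execution plumbing exposed.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, amount INT)")
con.executemany("INSERT INTO sales VALUES (?, ?)", sales)
print(con.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region").fetchall())
```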

Most of the rest of the presentation was about his love of graphing. BI is focused on providing more visual reporting of highly complex information, so it wasn't anything new. Still, what he showed of Teradata's focus is good, and his enthusiasm made it an enjoyable presentation even if it was more technical than I prefer. It also didn't hurt that the examples were primarily focused on marketing issues.

The one point with which I will take issue is the wall he tried to erect between graph databases and the graph routines Aster is leveraging. He claimed they're not really competing with graph databases because, Dan posited, the two are somehow different.

I pointed out that whether graphs are created in a database, in routines layered on top of SQL or in Java, or as part of a BI vendor's client tools only matters from a performance standpoint; they are all providing graphical representations to the business customer. That means they all compete in the same market. Technical distinctions do not make for business market distinctions, other than as components of cost and performance that impact the organization. There wasn't a clear response showing they were thinking at a higher level than technological differences.

Summary

Teradata has a long and storied history with large data. They are a respected company. The question is whether or not they're going to adapt to the new environments facing companies amid the explosion of data that's primarily unstructured and has a marketing focus. Will they be able to either compete or partner with newer companies in the space?

Teradata is a company that has long focused on large data, high performance database solutions. They seem clearly to be on the right path with their technology; the open question is whether the same is true of their strategic and marketing focus. They built their name on large databases for the few companies that really needed their solutions. Technology came first, and marketing was almost totally technical, aimed at the people who already understood the issue.

The proliferation of customer service and Web data means that the BI market is addressing a much wider audience for solutions managing large amounts of data. I trust that Teradata will build good technology, but will they realize that marketing has to become more prominent to address a much larger and less technical audience? Only time will tell.

TDWI Webinar Review: Business-Driven Analytics. Where's business?

Today's TDWI webinar was an overview of their latest best practices report. The intriguing thing was that the numbers show BI & analytics still aren't business driven. As Dave Stodder, Director of Research for Business Intelligence, pointed out, there are two key items contradicting that label. First, more than half of companies have BI in less than 30% of the organization, showing that a large number of businesses aren't prioritizing BI. Second, most of the responses to questions about BI show that it's still something controlled and pushed by IT.

One point Dave mentioned was the still-overwhelming presence of spreadsheets. They aren't going away soon. A few vendors who have presented at the BBBT have also pointed out their focus on integrating spreadsheets rather than ignoring all the data that resides in them or demanding everything be collected in a data repository. The sooner more vendors realize they need to work with the existing business infrastructure rather than fight against it, the better off the industry will be.

Another interesting point was the influence of the CMO. I regularly read about analysts and others talking about how the "CMO has a bigger IT budget than the CIO!" The numbers from the TDWI survey don't bear that out. One slide, a set of tables representing different CxO level positions' involvement in different areas of the IT buying process, shows the CMO up near the CIO for identifying the need, but far behind in every other category – categories that include "allocate budget" and "approve budget." In tech firms, and especially in Silicon Valley, people look around at other firms involved in the internet and forget they're a small subset of the overall market.

Another intriguing point was brought out in the survey. Of companies with Centers of Excellence or similar groups to expand business intelligence, the list of titles involved in those groups shows an almost complete dearth of business users. It seems that IT still thinks of BI as a cool toy they can provide to users, not something that business users need to be involved in to ensure the right things are being offered. Only 15% show line of business management involved while a pathetic 4% show marketing’s involvement.

The last major point I'll discuss is an interesting but flawed question/answer table. The question was about how business-side leadership is doing during different aspects of a BI project. The numbers aren't good. However, as we've just discussed, business isn't included as much as it should be. There are two things it makes me wonder:

  • What would the results look like if the chart were split to show how IT and business respondents each answered the question?
  • Is it an issue of IT not involving business or business not getting involved when opportunities are presented?

Summary

TDWI's overview of the current state of business-driven BI & analytics seems to show that there's clear demand from the business community, but there doesn't seem to be the business involvement needed to finish the widespread expansion of BI into most enterprises.

What I’d like to see TDWI focus on next is the barriers to that spread, the things that both IT and business see as inhibitors to expanding the role of modern BI tools in the business manager’s and CxO suite’s daily decision making.

It’s a good report, but only as a descriptive analysis of current state. It doesn’t provide enough information to help with prescriptive action.

TDWI & Actuate Webinar: Predictive Analytics for the Business Analyst

Today's TDWI webinar was a good one. Fern Halper is one of the few analysts who manage to speak to the points that are relevant to the sponsor, so that the full presentation works. Today was no different. Today's sponsor presenter was Allen Bonde, VP, Product Marketing and Innovation, Actuate, and they made a good team.

The presentation was a good overview of the basics of predictive analytics. It started with an overly complex but accurate description by Fern of predictive analytics as "a statistical or data mining solution consisting of algorithms and techniques that can be used on both structured and unstructured data to determine outcomes." Data mining is an overused term that still manages to be too limiting; it doesn't belong there. Neither does the enumeration of data types, as some predictive analytics, such as in operations, don't necessarily need unstructured data. I'd just say it's the analysis required to better understand the likelihood of near term outcomes.

What I really liked about both presenters is that they tended to use customer facing examples from sales, marketing and customer support to discuss predictive analytics. Too often, the focus is on operations, either at a detailed process level or a very high business review at the CxO level. The fact that there are many more applications that impact a larger body of business users is good for the market to see.

One thing that I think Ms. Halper didn't quite think through was how decision trees, her favorite tool for predictive analytics (at least as per this presentation), fit the category. While she did briefly mention that there's overlap between predictive analytics and other types of analysis, I'm not in agreement that trees fit the description – especially as her main example was customer support. In that arena, and others such as financial approvals, decision trees have long been used for call scripts and process flow. They aren't used to make predictions but to help the service folks make decisions regardless of the outcome at each step. I'd like to hear more about how she thinks they tie into the other predictive tools she mentioned.
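For contrast, here's what a decision tree looks like when it genuinely is predictive: learned from historical outcomes and emitting a probability for a case it has never seen, rather than walking a rep through a hand-authored script. A minimal sketch with invented toy data, assuming scikit-learn is available:

```python
# A decision tree used for prediction: fit to historical outcomes,
# not hand-authored like a call script. Toy data invented for illustration.
from sklearn.tree import DecisionTreeClassifier

# Features per customer: [monthly_spend, support_calls_last_quarter]
X = [[20, 0], [25, 1], [90, 4], [85, 5], [30, 0], [95, 6]]
y = [0, 0, 1, 1, 0, 1]          # 1 = the customer churned

model = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

# The tree predicts an outcome for an unseen customer; a call-script
# tree would instead route a rep through predetermined branches.
print(model.predict([[88, 3]]))         # likely-churn prediction
print(model.predict_proba([[88, 3]]))   # class probabilities
```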

Another key point Fern made was how the new technology means the tools are available closer to the actual business knowledge worker, with applications becoming usable by business analysts, not just statisticians. The numbers from the TDWI Best Practices Report, Predictive Analytics for Business Advantage, 2014, were interesting.

[Figure: Predictive analytics tool use, from the TDWI Best Practices Report, Predictive Analytics for Business Advantage, 2014]

It was humorous, but no surprise, that the first question in Q&A was from somebody who probably defines him or herself as a “data scientist.” There was umbrage at mere business people being able to work with predictive analytics packages. Halper tried to allay the fears that the money spent on an MS might become useless by pointing out that detailed math is needed to create the processes and some understanding of the analytics is still needed to intelligently use the results of the analysis.

Still, I expect the numbers for analysts and "other business users" to grow in the near future, while the statistician is more properly used to build the algorithms and think of new techniques that can then be provided to the knowledge worker through modern tools.

Allen Bonde's section of the presentation, unlike others that have been too technical or salesy, was too high level and didn't differentiate enough from Fern Halper's. While we want to see companies positioning themselves as thought leaders push concepts, they are the sponsor and need to tie the thoughts back to their business.

Let's start with Actuate's tagline: The BIRT company. What's BIRT? A bleeding edge audience will know, but TDWI has a wide range of knowledge in its audience, from open source shops to large enterprises that are still almost exclusively proprietary software shops. He needed to give just a couple of sentences of explanation, but he didn't.

The Business Intelligence and Reporting Tools (BIRT) project is an attempt to create an open source BI interface. It was started by Actuate, which turned it over to a foundation to drive open development. As usual, for most of us, refer to the appropriate Wikipedia article for a brief overview.

His main point was, as he put it, "Fast is the new big." The concept isn't new; it's rather a return from the focus on the misnamed Big Data to taking whatever data we have, regardless of size, and shortening the time to analyze it and provide that information to decision makers. Most of the rest of what he said also wasn't new to people who have been in the BI industry for a while, but it was at a good level to reinforce Halper's point to an audience that is just getting familiar with the state of the market.

Summary

Fern Halper and Allen Bonde gave a good, high level overview of some of the key points about predictive analytics. I had nits to pick with some issues and think a little more meat might help, but I suggest going to TDWI to view the webinar if you want to know the basic issues involved in starting to use predictive analytics as part of a robust BI solution that helps business make better decisions.

@fhalper, @abonde, @tdwi, @actuate

EXASOL at the BBBT: Big Data, fast database. Didn’t I just hear this?

Friday’s EXASOL presentation to the BBBT brought a strong feeling of déjà vu. I’ve already blogged about the Tuesday Actian presentation and, to be honest, there were technical differences but I came to the same conclusion about the business model. But first, a thanks to Microsoft for the autocorrect feature. Otherwise typing EXASOL in all caps each time would have been bothersome.

The EXASOL presenters were Aaron Auld (@AaronAuldDE), CEO, and Kevin Cox (@KJCox), Director Sales and Marketing.

I mentioned technical differences. First, and foremost, they didn't start with hardware but with an initial algorithm for massively parallel processing (MPP). They figured it was a great way to speed up database performance and stuck with column-oriented relational technology. That's allowed them to deliver fast performance on multi-terabyte systems.
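As a conceptual aside (my toy illustration, not EXASOL's implementation), the columnar bet pays off for analytic queries because an aggregate over one column only has to touch that column's values, not every byte of every row; MPP then splits each column across nodes and scans the slices in parallel.

```python
# Conceptual row-store vs column-store layouts; not EXASOL's engine.

# Row-oriented: each record keeps all of its fields together.
rows = [
    {"id": 1, "region": "east", "amount": 100, "notes": "..."},
    {"id": 2, "region": "west", "amount": 250, "notes": "..."},
]

# Column-oriented: each column lives (and is scanned) on its own.
columns = {
    "id":     [1, 2],
    "region": ["east", "west"],
    "amount": [100, 250],
    "notes":  ["...", "..."],
}

# SUM(amount) on the row store drags every field of every row through
# memory; on the column store it reads one dense array. An MPP engine
# would additionally split that array across nodes and scan in parallel.
assert sum(r["amount"] for r in rows) == sum(columns["amount"]) == 350
```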

They have published some great TPC-H benchmark numbers, often two orders of magnitude better than the competitors'. They admitted that TPC stats are questionable, since the benchmarks have been defined by the big vendors to favor their own performance, often don't reflect real life queries and often don't use typical hardware; still, the numbers were impressive. In addition, it was a smart business move, as a small company blowing away the big vendors' benchmarks elevates visibility and gets them in doors.

However, let's look back at Actian. They also talked about TPC, but they used the TPC-DS benchmark. How do you compare the two? Well, you can't.

One other TPC factoid: just like their competitor, there's no clear information on true multi-user performance in today's mobile age. No large numbers of connected clients were mentioned.

So the results are great, but how do they fight the Hadoop bandwagon? They understand that open source is cheaper from a license standpoint, but they also point out that their performance saves money in direct comparison when you total all the costs of an implementation. People forget that while hardware prices have dropped, servers aren't free.

Unfortunately, from a business model standpoint, it looks like they're making the typical startup mistake of focusing on their product rather than business needs. They understand that ROI matters, but it seems to be too far down the list in their corporate messaging.

Another major advantage they share with the previous presenters is that sticking with SQL makes it easier to build an ecosystem that includes the existing vendors from ETL through visualization. However, they seem to be a bit further behind the curve in building those partnerships. While they have a strong strategic understanding of that, they need to bubble it up the priority list.

[Figure: EXASOL platform offering]

One critical business success they have is their inclusion in the Dell Founders Club 50. That means advice and cooperation from Dell to help improve their performance and expand their presence. For a small company to have access not only to Dell at the technical level but also to bring customers to Dell Solution Centers for demonstrations is a great thing.

While they have been focused on MPP and large customers, the industry move to the Cloud also means they are looking at smaller licensing including a potential one-node free trial.

However, as mentioned in the lead, they seem to have the same business model issue as their competitors: They're focused on the bleeding edge market that thinks the main message is performance. While they know there are other aspects to the buying decision, they went back, again and again, to performance. They have the whole picture in mind, but they're not yet thinking of the mass market.

Organizations such as TDWI, Gartner and Forrester have all reported the high percentage of organizations that are considering big data and how to get a handle on the vast volume of information coming from heterogeneous sources. There's clearly demand building up behind the dam. The problem seems to be that those organizations are trying, as major IT organizations always do, to understand how best to integrate new technologies and capabilities with as little pain as possible. Meanwhile, the vendors seem to still be focused on the early adopters with their messaging. That leaves dollars on the table and slows adoption of new technology.

Summary

EXASOL seems to have a strongly performing and highly scalable database technology for working with large data sets. Yet, like many companies in the business intelligence space, it comes back to audience. Are they still aiming at early adopters or will they focus on the mass market?

Have BI and big data advanced to the point where people need to think about the chasm and how to better address business needs, not just technical issues? I think so, and I hope they adjust their business focus.

The company seems to have great potential, but will they turn that into reality? As the great Yogi Berra said, "It's déjà vu all over again."

Actian at the BBBT: Hadoop Big Data for the Enterprise Mass Market?

In the mid-90s, Sybase rolled out its new database. It was a great leap forward in performance, and they pushed it like crazy. Sybase's claims were justified, but it was a new way to look at databases, and Sybase loudly announced how different it was from what people were used to using. Oops. They sold almost none of it, hit a financial wall, and never quite recovered.

That came to mind during yesterday’s BBBT presentation by Actian. Their technology foundation goes back to Ingres and that means they’ve been in the database market a long time. The question is whether or not they’ve learned from past case studies.

The presenters were John Santaferraro, VP of Solution and Product Marketing, and Emma McGrattan, SVP Engineering. They gave a great technical overview of Actian’s offerings. Put simply, they’re providing a platform for Big Data access. At the core is Hadoop, but they’ve taken their deep understanding of RDBMS technology and incorporated SQL access. That clearly opens up two things:

  • Better access to partners for ETL and analytics
  • The ability for the mass of business analysts to get at Hadoop data to more easily perform their jobs.

That's a great thing, and I'll discuss later whether they're taking that technology to the right markets. Before that, however, I should point out the main competitive point they repeatedly hit on. TPC benchmarks are public, so they went out and compared themselves to the product they consider, rightly, to be their main competition: Cloudera Impala. Their results are seen in the chart below.

[Figure: Actian's TPC-DS comparison with Cloudera Impala]

They returned to this time and time again. On the other hand, they discussed the full platform intelligently but only briefly.

They also covered more of the technology, and there's a lot of it. With roots that go back through Computer Associates, the company grows by acquisition. It's not just a renamed Ingres; it has acquired VectorWise, Versant, Pervasive and ParAccel. Many companies have had trouble acquiring and integrating firms, but the initial descriptions seem to show a consolidated platform.

One caveat: We had no demo. The explanation was that the Hadoop Summit demo went so well that they're in the middle of moving it to a new server and IT didn't give them a heads up. Believable, and I personally am not too worried. As a former field guy, I know how little emphasis to put on a short demo.

So what did I think was the key technology, if not performance? That’s next.

Hadoop meets SQL

To folks focused on the largest data sets, and to those who, as with car ownership, like speed for its own sake, the performance is impressive. To me, that's not the key. Rather, it's the ability to bridge the Hadoop-SQL divide. As John Santaferraro pointed out, orders of magnitude more business analysts and business users know SQL than know MapReduce and the related underpinnings of Hadoop.

[Figure: Actian platform]

While other Big Data companies have been building bridges to ETL, data cleansing, analytics and other tools in the ecosystem, the custom work to do that is time consuming. Opening up the ability to use standard, existing SQL tools means you can build a stronger ecosystem more quickly, as the sketch below illustrates.
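A minimal sketch of that leverage (sqlite3 standing in for any SQL-speaking engine, pandas for the off-the-shelf tool; nothing here is Actian-specific):

```python
# Once a platform exposes SQL over a standard connection, existing tools
# plug in with no custom connector. sqlite3 stands in for any SQL engine.
import sqlite3
import pandas as pd

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE events (user_id INT, action TEXT)")
con.executemany("INSERT INTO events VALUES (?, ?)",
                [(1, "view"), (1, "buy"), (2, "view")])

# No MapReduce job and no bespoke bridge -- one line of standard tooling.
df = pd.read_sql_query(
    "SELECT action, COUNT(*) AS n FROM events GROUP BY action", con)
print(df)
```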

Why does that matter?

What is the market?

During the presentation, the Actian team was asked about their sweet spot. Is it folks already playing with Hadoop who want better access to enterprise data, or is it companies who've heard about Hadoop but haven't yet stepped in to try it because of all the questions? Their answer was the first group. I think that's wrong; however, I understand why they answered that way.

Another statement from John was that they are in Silicon Valley, and everyone there thinks everyone uses Hadoop because everyone there does. He admitted that's not true outside that small region. However, sometimes it's hard to fight the difference between what you intellectually know and what you're used to. I've seen it in multiple companies, and I think it's happening here.

The mass of global businesses haven't yet touched Hadoop. It's very different from what the typically overburdened and underfunded IT organization does, and that much change is scary. Silicon Valley is full of early adopters; it attracts them. In addition, there are plenty of early adopters out there for the picking. However, there are now a lot of vendors in the BI and big data spaces and we're getting close to a tipping point. The company that figures out how to cross the chasm first is the one that will make it big.

It’s not pure performance that will attract the mass market, it’s how to get the advantages of big data in the most affordable way with the easiest transition path. It’s the ability to quickly leverage existing IT infrastructure and to join it with the newest technology.

Once again, it’s evolution rather than revolution that will win the day.

Summary

From what I saw of the platform, it's a great start. The issue I see is the focus on the wrong market. The technology will always be important, but critical as it is, it only exists to solve business problems. Actian seems to have a good handle on the technology and to be on a path to integrate and leverage all the acquisitions into a solid platform, but will they be able to explain why that matters to the right market?

There is hope for that. One thing discussed is that their ability to bridge SQL and Hadoop means they are working on building partnerships with major vendors to extend their ecosystem. If they focus on that, they have a great chance of being very successful and being the company that brings Hadoop to the wider IT market.

Twitter: @actiancorp, @santaferraro & @emmakmcgrattan

Salient at the BBBT: The Thin, Blue Suited Line

The problems with the USA's Veterans Administration are in the news. Many of the scheduling issues have to do with large volumes of modern data being run through decades-old systems built by systems integrators (SIs). Custom-built systems can be the right choice during the early stages of a software solution category's life cycle. However, they are very difficult to upgrade and modernize.

Last Friday's BBBT presentation was by Salient Management Company. Salient is a consulting group with a product, and all the potential hazards of SIs came to mind. The presenters, David F. Giannetto and Jim McDermott, are both in professional services. The obvious question was how much of their solution is customized versus how much is truly a software solution that provides the ability to upgrade and adapt as needs change.

The Software-Services Balance

Modern business software is complex. Every software firm must have professional services, either internally or through partners, to help with implementation. Many software founders think their software is so wonderful that they don't need serious professional services. Many professional services companies think every client is so different that software must always be heavily customized. How do technology executives balance the differing demands of ISV software development and the need for consulting? More importantly, for this article, how does Salient's management look at that?

From the cursory experience of a three hour presentation, the balance is a strength of Salient. One slide, in particular, pointed to a logical split. They point out that the business user is not the person who has to understand the technology. That's something everyone agrees is true, but not many companies seem to understand how to address.

David Giannetto and Jim McDermott presented a company that claims to focus its consulting on understanding the client's business model, helping to ensure that the implementation addresses business needs, while demonstrating a product that looks like a standard interface.

While that was the focus of the talk, the product demo implied other consulting. They did not cover the complexity of ETL, even when questioned about it, so I'm also assuming significant technical professional services are needed to link to data sources. That assumption is backed up by the fact that Salient uses a proprietary database that wasn't discussed in detail.

One critical point about their technology insight is that Salient began with an in-memory architecture thirty years ago. It was a risky choice, as most companies thought the price/performance benefits of disk would grow far faster than those of RAM. The drop in RAM prices and the growth of parallel computing software are providing strong backing for their initial gamble. They have a clear focus on technology and products.

Their offering seems to be a sandwich of services to understand the business and implement the data acquisition on the outside with robust software for the BI users at the heart. I can see a continued strength in the business consulting, but the robustness of some newer vendors as far as simplifying the back end, and thereby lowering those costs and shortening implementation times, is a potential risk to the Salient model.

The Interface

Jim's demo showed both dashboards that allow management to slice & dice basics themselves and a design interface with more capabilities for power users and analysts. While they claim that the software changes the role of IT from control to governance, I still didn't see anything that allows end users to integrate new sources. IT is still required for more than just governance.

There was also a very good example of how geospatial information is being integrated into the analysis to better understand demographics and logistics. In the CPG market, that can provide a crucial advantage.

One key point that some competitors might knock is that most of the charts and graphs aren't as fancy as in some BI tools. However, my response is "so what?" First, they accomplish the same things. Second, focusing on how fancy graphs look sometimes creates overly complex displays that can slow understanding. When we're dealing with executives who have worked with Microsoft Excel or with Crystal Reports for decades, a way of seeing new analytics clearly and simply, in almost the style they've always seen, can help adoption. The focus is on understanding the information, and I thought the simpler graphics had the benefit of a comfort level for managers.

Summary

Overall, I was impressed by Salient. The combination of strong business consulting, a good BI interface and a history in in-memory data management means that they’re well positioned to address Global 1000 firms. Any large organization should evaluate the Salient offering for the combination of product and services.

The risk I see is that while the service/software balance is right for large companies, I don't see them getting into the SMB market anytime soon. While that might not concern them in the short term, one word: Salesforce.com. There are new vendors coming up that are much easier to implement. If they can grab a large chunk of the SMB market, they can then move up the food chain to challenge the large companies, as Salesforce has done in its markets.

I see Salient growing, I’m just not sure if they’ll be able to grow as fast as the market is growing. Depending on their plans, that could be good or bad.

The Myth of the Data Scientist

I've been waiting for notification and searching on toolbox.com's site itself. Silly me, thinking their search would work. Now that I've moved across the country and am getting settled in, after a few weeks of "fun," I did a Yahoo search. Found it!

Before Ziff-Davis decided they didn’t want more IT Management track articles from a writing house I worked with, they bought two of my articles. The first is The Myth of the Data Scientist. I’ve muttered about it before and will again, but that’s a longer article. Enjoy.

TDWI and HP Webinar: Modernizing the Data Warehouse

After a couple of mediocre webinars, it was nice to see TDWI get back on track. This week’s seminar was sponsored by HP Vertica and discussed Data Warehousing Modernization. The speakers were Philip Russom, from TDWI, and Steve Sarsfield, Product Marketing Manager, HP Vertica.

Philip led with the five key reasons organizations need to modernize Enterprise Data Warehouses (EDWs):

  • Analytics
  • Scale
  • Speed
  • Productivity
  • Cost Control

He pointed out that TDWI research shows the first three to be far more of a key focus for companies than the others. One key point was that cost control should have more of an impact than it does. Mr. Russom pointed out that even if your EDW performs properly today, much of the new technology is based on open source and less expensive servers, so a rethink of your warehouse can bring clear ROI: "Modernization is a great opportunity to rethink economics."

Another major point was the simple fact, overlooked by many zealots, that EDWs aren’t going anywhere. Sure, there are newer technologies that allow for analytics straight from operational data stores (ODSs) and other places, but there will always be a place for the higher latency accumulation of information that is the EDW.

After that setup, Steve Sarsfield gave the expected sponsor pitch for how HP Vertica helps companies modernize. It's fair to say that his presentation was better than most. It walked the right line, avoiding the overly salesy and overly technical extremes of many sponsor pitches.

Sarsfield’s main point is that Hadoop is great for ODSs but implementations still haven’t gotten up to speed in joins and other data manipulation capabilities seen in the mature SQL environment. He described HP Vertica as having the following key components:

[Figure: HP Vertica "secret sauce" components]

I think the only one that needs explanation is the last, projections. If not, please let me know and I'll expand on the others. Projections are, simply put, HP's method for replacing indices. Columnar databases don't provide the index structures that standard row-based RDBMS systems provide.
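As a conceptual sketch of the idea (an analogy in plain Python, not Vertica's actual storage engine): a projection is a redundant copy of the data kept pre-sorted on the keys a family of queries needs, so lookups become range scans over sorted columns rather than probes into a separate index structure.

```python
# Conceptual sketch of projections; an analogy, not Vertica's engine.
# Instead of separate indices, keep redundant copies of the data,
# each pre-sorted for a different family of queries.
from bisect import bisect_left, bisect_right

sales = [("west", 250), ("east", 100), ("east", 75), ("west", 25)]

by_region = sorted(sales)                          # serves region lookups
by_amount = sorted(sales, key=lambda r: r[1])      # serves amount ranges

# Query 1: all rows for region 'east' -- a binary-searched range scan.
regions = [r[0] for r in by_region]
lo, hi = bisect_left(regions, "east"), bisect_right(regions, "east")
print(by_region[lo:hi])                 # [('east', 75), ('east', 100)]

# Query 2: all rows with amount >= 100 -- same trick on the other copy.
amounts = [r[1] for r in by_amount]
print(by_amount[bisect_left(amounts, 100):])
```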

It was a good overview that should bring HP into the mix for anyone looking to modernize their EDW environment.

The final point that came up during Q&A was about Big Data. It’s one many folks have made, but we know how much you listen to analysts pontificating…

Philip Russom pointed out, as many have, that Big Data isn't about the size of the data but about managing the complexity of modern data. He made that point while pitching the most recent TDWI Best Practices Report, Evolving Data Warehouse Architectures in the Age of Big Data. What Philip pointed out was that the judges regularly came back with clear opinions that complexity was more important than database size. Very large databases where people were just doing aggregations of columns weren't interesting. It was the ability to link to multiple sources and provide advanced insight through analytics that the judges felt most reflected the power of the concept of Big Data.

All told, it was a smooth and informative presentation that hopefully helped its IT audience understand a bit more about the issues involved in modern data warehousing. It was time well spent.

GoodData at the BBBT

Today's BBBT presentation was by GoodData, and I'm still waiting. Vendor after vendor tells us that they're very special because they're unique when compared to "traditional BI." They don't seem to get that the simple response is "so what?" Traditional BI was created decades ago, when offering software in the Cloud was not reasonable. Now it is. Every young vendor has a Cloud presence, and I can't imagine there's a "traditional" company that isn't moving to a Cloud offering. BI is not the Cloud. I want to hear why they have a business model that differentiates them from today's competitors, not from the ones in the 1990s. I'm still waiting.

Almost all the benefits mentioned were not about their platform; they weren't even about BI. What was mentioned were the benefits that any application gets by moving to the Cloud. All the scalability, shareability, upgradability and other Cloud benefits do not a unique buying proposition make. Where they will matter is if GoodData has implemented those techniques faster and better in the BI space than the many competitors who exist.

Serial founder Roman Stanek wants his company to provide a strong platform for BI based on open source technology. The presentation, however, didn't make clear whether he really has that. He had the typical NASCAR slide, but only under NDA, with only a single company mentioned as an open reference. His technological vision seems to be good, but it's too early to say whether or not the major investments he has received will pay off.

What I question is his business model. He and his VP of Marketing, Jeff Morris, mentioned that 2/3 of their revenue comes from OEM agreements, embedding their platform into other applications. However, his focus seems to be on trying to grow the other third, the direct sales to the Fortune 2000. I’m not sure that makes sense.

Another business model issue is that the presenters were convinced that the Cloud means they can provide a single version of product to all customers. They correctly described the headaches of managing multiple versions of on-premises software (even if they avoided saying “on-premise” only a third of the time). However, the reason that exists is because people don’t want to switch from comfortable versions at the speed of the vendor. While the Cloud does allow security and other background fixes to easily update to all customers, any reasonable company will have to provide some form of versioning to allow customers a range of time to convert to major upgrades.

A couple of weeks ago, 1010data went in the other direction, clearly admitting that customers prefer versioning. I didn't mention it in my blog post on that presentation, even though I thought they went too far toward too many versions; but combined with GoodData's thinking that there should be only one, now's as good a time as any to mention it. Good Cloud practices will help minimize the number of versions that need to be active for your customers, but it's not reasonable to think that will mean a single version.

At the beginning of the presentation, Roman mentioned a negative reference: Crystal Reports. At this point, I don't think that comparison is at all negative. Nothing that GoodData showed us led me to believe that they can really get access to the massively heterogeneous data sources in true enterprise business. He also showed nothing that indicates an ability to provide the top level analysis and display required in that market. However, providing OEM partners a quick and easy way to add basic BI functions to their products seems to be a great way to build market share and bring in revenue. While Crystal Reports seems archaic, it was the right product with the right business plan at the right time, and the product became the de facto standard for many years.

The presentation left me wondering. There seems to be a sharp team, but there wasn't enough information to see if vision and product have gelled to create a company that will succeed. The company's been around since 2008 and just officially released the product, yet it has a number of very interesting customers. That can't be based just on the strong reputation of Mr. Stanek; there has to be meat there. How much, though, is open to question based on this presentation. If you're considering an operational data store in the Cloud, talk with them. If you want more, get them to talk to you more than they talked to us.