Category Archives: Enterprise Software

GoodData at the BBBT

Today’s BBBT presentation was by GoodData, and I’m still waiting. Vendor after vendor tells us they’re special because they’re unique when compared to “traditional BI.” They don’t seem to get that the simple response is “so what?” Traditional BI was created decades ago, when offering software in the Cloud was not reasonable. Now it is. Every young vendor has a Cloud presence, and I can’t imagine there’s a “traditional” company that isn’t moving to a Cloud offering. BI is not the Cloud. I want to hear why they have a business model that differentiates them from today’s competitors, not from the ones in the 1990s. I’m still waiting.

Almost all the benefits mentioned were not about their platform; they weren’t even about BI. What was mentioned were the benefits any application gets by moving to the Cloud. All the scalability, shareability, upgradability and other Cloud benefits do not a unique buying proposition make. Where they will matter is if GoodData has implemented those techniques faster and better in the BI space than its many competitors.

Serial founder Roman Stanek wants his company to provide a strong platform for BI based on Open Source technology. The presentation, however, didn’t make clear whether he really has that. He had the typical NASCAR slide, but only under NDA, with only a single company mentioned as an open reference. His technological vision seems good, but it’s too early to say whether the major investments he has received will pay off.

What I question is his business model. He and his VP of Marketing, Jeff Morris, mentioned that two-thirds of their revenue comes from OEM agreements, embedding their platform into other applications. However, his focus seems to be on growing the other third, direct sales to the Fortune 2000. I’m not sure that makes sense.

Another business model issue is that the presenters were convinced the Cloud means they can provide a single version of the product to all customers. They correctly described the headaches of managing multiple versions of on-premises software (even if they only managed to avoid saying “on-premise” a third of the time). However, those headaches exist because people don’t want to switch from comfortable versions at the speed of the vendor. While the Cloud does allow security and other background fixes to be pushed easily to all customers, any reasonable company will have to provide some form of versioning to give customers a range of time to convert to major upgrades.

A couple of weeks ago, 1010data went the other direction, clearly admitting that customers prefer multiple versions. I didn’t mention that in my blog post on that presentation, even though I thought they went too far toward too many versions; but combined with GoodData’s thinking that there should be only one, now’s as good a time as any to mention it. Good Cloud practices will help minimize the number of versions that need to be active for your customers, but it’s not reasonable to think that means a single version.

At the beginning of the presentation, Roman mentioned a product as a negative reference: Crystal Reports. At this point, I don’t think that comparison is at all negative. Nothing GoodData showed us led me to believe they can really get access to the massively heterogeneous data sources of a true enterprise business. He also showed nothing that indicates an ability to provide the top-level analysis and display required in that market. However, providing OEM partners a quick and easy way to add basic BI functions to their products seems a great way to build market share and bring in revenue. While Crystal Reports seems archaic, it was the right product with the right business plan at the right time, and it became the de facto standard for many years.

The presentation left me wondering. There seems to be a sharp team, but there wasn’t enough information to see whether vision and product have gelled to create a company that will succeed. The company’s been around since 2008 and just officially released the product, yet it has a number of very interesting customers. That can’t be based solely on the strong reputation of Mr. Stanek; there has to be meat there. How much, though, is open to question based on this presentation. If you’re considering an operational data store in the Cloud, talk with them. If you want more, get them to tell you more than they told us.

VisualCue at BBBT: A New Paradigm for Operational Intelligence

The latest presentation to the BBBT was by Kerry Gilger, President and Founder of VisualCue™ Technologies. While I find most of the presentations interesting, this was a real eye-opener.

Let’s start with a definition of operational intelligence (OI): tools and procedures to better understand ongoing business operations. It is a subset of BI focused on ongoing operations in manufacturing, call centers, logistics and other physical operations, where the goal is not just to understand the high-level success of processes but to better understand, track and correct individual instantiations of the process.

A spreadsheet with a row of data for each instantiation is a cumbersome way to quickly scan the status of individual issues. The following image is an example of VisualCue’s solution: a mosaic of tiles showing different KPIs of the call center process, with a tile per operator, color-coded for quick understanding.

VisualCue call center mosaic

The KPIs include items such as call times, number of calls and sales. The team understands each element of the tile, and a quick review shows the status of each operator. Management can drill down into a tile to see specifics and take corrective action.
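To make the tile concept concrete, here is a minimal Python sketch of how a mosaic might map per-operator KPI values to traffic-light colors. The KPI names and thresholds are my own invention for illustration; VisualCue’s actual rules and rendering are proprietary.

```python
# A minimal sketch of the tile idea; KPI names and thresholds are invented.
# Thresholds per KPI: (warn, alert) boundaries, where higher is worse.
THRESHOLDS = {
    "avg_call_minutes": (6.0, 9.0),
    "calls_per_hour_deficit": (2.0, 5.0),
    "missed_sales": (3.0, 6.0),
}

def kpi_color(kpi: str, value: float) -> str:
    """Map a KPI value to a traffic-light color for one tile element."""
    warn, alert = THRESHOLDS[kpi]
    if value >= alert:
        return "red"
    if value >= warn:
        return "yellow"
    return "green"

def operator_tile(name: str, kpis: dict) -> dict:
    """Build one operator's tile: a color per KPI element."""
    return {"operator": name,
            "elements": {k: kpi_color(k, v) for k, v in kpis.items()}}

# One tile per operator; a manager scans the mosaic for red elements.
mosaic = [
    operator_tile("Alice", {"avg_call_minutes": 4.2,
                            "calls_per_hour_deficit": 0.5, "missed_sales": 1}),
    operator_tile("Bob",   {"avg_call_minutes": 9.5,
                            "calls_per_hour_deficit": 5.5, "missed_sales": 7}),
]
for tile in mosaic:
    print(tile)
```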

The mosaic is a quick way to review all the instantiations of a given process, a new and exciting visualization method in the industry. However, they are a startup and there are issues to watch as they grow.

They have worked closely with each customer to create tiles that meet that customer’s needs. They are working to make it easier to replicate industry knowledge so new customers can start faster and less expensively.

The product has also moved from fully on-site code to a SaaS model, providing shared infrastructure, knowledge and more in the Cloud.

VisualCue understands that operational intelligence is part of the BI space and has begun to work with standard BI vendors to integrate the mosaic with the other elements that make up a robust dashboard. That work is rightfully in its infancy given the company’s evolutionary stage; if they keep building and expanding those relationships, there’s no problem.

However, the thing that must change to make it a full-blown system is how they access data. It’s understandable that a startup expects a customer to figure out its own data access issues and provide a single-source database to drive the mosaics, but as they grow they’ll have to work more closely with ETL and other vendors to provide a more open access methodology and a more dynamic, open data definition and access model than “give us a lump of data and we’ll work with it.”

Given where the company is right now, those caveats are more foibles than anything else. They have the time to build out the system, and their time has, correctly, been spent creating the robust visualization paradigm they demonstrated.

If Kerry Gilger and the rest of his team are able to execute the vision he’s shown, VisualCue will deliver a major advance in the ability of business management to quickly understand operations in a way that provides instant feedback to improve performance.

TDWI Webinar on in-memory data, another miss

I’ve been skipping the last few TDWI webinars, not exactly knowing how to politely criticize some poor ones. However, I feel as if I’m doing a disservice to those who read, so I’ll have to discuss today’s.

The title was “In-Memory Computing: Expanding the Platform Horizon Beyond the Database.” The pitch was that in-memory has been so good for databases that we should think about doing everything else in the information chain there too, from ETL on down. One-word critique: Oy.

In-memory analytics has been great for very fast processing. Having the data resident in memory is obviously a great way of providing rapid response for users of reports and analytic tools. However, it’s no panacea.

Simply put, two demands limit the cost effectiveness, and even the feasibility, of in-memory analytics: the amount of data and the number of users.

One of the repeated refrains of in-memory proponents is “memory is cheap!” Yes, it is. However, massively parallel servers with the ability to efficiently link multiple CPUs to large amounts of memory, while providing coherence for multiple users, aren’t. They quickly get very expensive, with high-end machines costing far more per unit of memory than commodity servers. There’s also an upper bound, and for much of today’s larger data analytics, multiple servers will be needed.

The other issue is that the growth of self-service BI and mobile access to reports means more memory is needed for non-database usage. A number of in-memory solution providers will tell you that each user takes space in memory to satisfy individual needs. The more users, the more space is taken away from database availability.
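Some quick, back-of-envelope arithmetic shows why user count matters. The numbers below are purely illustrative, not any vendor’s figures:

```python
# Illustrative only: how per-user working space eats into the memory
# left for the resident database on a single large server.
total_ram_gb = 512            # one big in-memory server
dataset_gb = 300              # data held resident in memory
per_user_workspace_gb = 0.6   # assumed scratch space per concurrent user

for users in (50, 200, 400):
    headroom = total_ram_gb - dataset_gb - users * per_user_workspace_gb
    print(f"{users:>3} users -> {headroom:6.1f} GB left for the database")
# At 400 users the headroom goes negative: data or users must spill elsewhere.
```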

The growth of server farms, now being built in the Cloud, is how the tension between in-memory performance and space requirements will be addressed. “Fast enough” matters more than millisecond response time. With what we are constantly learning about both data manipulation and presentation, the strongest Cloud providers will win by keeping the most-used information in memory and sharing the rest among caches on multiple servers.
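The classic technique behind “keep the most-used information in memory” is a least-recently-used (LRU) cache. Here is a minimal Python sketch, with fetch_from_store standing in for whatever slower disk or network tier a real platform would consult:

```python
# A minimal LRU cache sketch: hot items stay in memory, cold ones are evicted.
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key, fetch):
        if key in self.data:
            self.data.move_to_end(key)      # mark as most recently used
            return self.data[key]
        value = fetch(key)                  # miss: go to the slower tier
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)   # evict the least recently used
        return value

def fetch_from_store(key):                  # hypothetical slow lookup
    return f"value-for-{key}"

cache = LRUCache(capacity=2)
for k in ["a", "b", "a", "c", "b"]:         # "b" gets evicted by "c"
    print(k, cache.get(k, fetch_from_store))
```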

In-memory isn’t new, and needs are much different than they were when it first appeared. Listen to people talk about it and pay attention: if it’s the only thing discussed, they’re not being honest; if it’s a core part of the solution with its caveats addressed, the vendor, analyst or pundit is helping you.

BI is Dead! Long Live BI!

In memory! Big Data! Data Analytics! The Death of BI!!!!

We in technology love the new idea, the new thing. Often that means we oversell the new. People love to talk about technology revolutions, but those are few and far between. What usually happens is evolution, a new take on an existing idea that adds new value to the same concept. The business intelligence space is no more immune to the revolution addiction than is any other area, but is it real?

Let’s begin to answer that by starting with the basic question: what is BI? All the varied definitions come down to the simple point that BI is an attempt to better understand business by analyzing data. That’s not a precise definition but a general one. IT was doing business analysis before it was called that, whether in individual systems or ad hoc through home-grown software. Business intelligence became a term when MicroStrategy, Cognos, Business Objects and others began to create applications that could combine information across systems and provide a more consistent and consolidated view of that data.

Now we have a new generation of companies bringing more advanced technologies to the fore to improve data access, analysis and presentation. Let’s take a look at a few of the new breed and ask whether they’re evolution or revolution.

Some Examples

In-Memory Analysis

As one example, let’s discuss in-memory data analysis as headlined by QlikTech, Tableau and a few others. Loading all the data to be analyzed into memory makes performance much faster; the lack of caching and other disk access means blazingly fast analytics. What does this mean?

The basic response is that more data can be massaged without appreciable performance degradation. We used to think nothing of getting weekly reports; now a five-second delay from clicking a button to seeing a pie chart is considered excessive. However, is this a revolution? Chips are far faster than they were five years ago, not to mention twenty, but while technologies have been layered to squeeze more junctions, transistors and so on into smaller spaces, it’s the same theory and concept. The same is true of faster display of larger data volumes. It’s evolutionary.

Another advantage of the much faster performance is that calculations can be much more complex; moving from simple pivot tables to scatter charts with thousands of points is a change in complexity. However, it’s still just another way of providing business information to business people so they can make decisions. It’s hard to describe that as revolutionary, yet some do.
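As a toy illustration of that jump, once the data is resident in memory a pivot is a one-liner, and scaling the same pattern up to thousands of points is more of the same rather than a new kind of work. The sample data here is invented:

```python
import pandas as pd

# Invented sample data held entirely in memory.
sales = pd.DataFrame({
    "region":  ["East", "East", "West", "West", "West"],
    "product": ["A", "B", "A", "B", "A"],
    "revenue": [100, 150, 90, 120, 60],
})

# The simple case: a pivot table of revenue by region and product.
print(sales.pivot_table(index="region", columns="product",
                        values="revenue", aggfunc="sum"))
```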

Big Data

Has there been a generation of business computing that hasn’t thought it had big data? There’s a data corollary to Parkinson’s law which says that data expands to fill the available disk space. The current view of big data is just the latest set of techniques that let people massage the latest volumes of data being created in our interconnected world.

Predictive Analytics

Another technical view is that, because software has historically been limited to describing what has happened, business leaders weren’t using it for prediction. Management has always been the combination of understanding how the business has performed and then predicting what will be needed for future market scenarios in order to plan appropriately. Yes, there are new technologies advancing the ability of software to provide more of the analytics that help with prediction, but it’s a sliding of the bar, letting software do a bit more of what business managers had been doing through other means.
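As a tiny example of that bar sliding, the same history that feeds a descriptive report can feed a simple forecast. The numbers are invented and the linear trend is the crudest possible model, but it shows the descriptive-to-predictive step:

```python
import numpy as np

quarters = np.arange(8)                                  # 8 quarters of history
revenue = np.array([10, 11, 13, 12, 14, 15, 17, 18.0])   # invented figures

# Descriptive: what happened last quarter.
print("last quarter:", revenue[-1])

# Predictive: fit a linear trend and extrapolate one quarter ahead.
slope, intercept = np.polyfit(quarters, revenue, deg=1)
print("next quarter estimate:", round(slope * 8 + intercept, 1))
```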

But there’s something new!

Those new technologies and techniques are all advancing the ability of business people to do their jobs faster and more accurately, but we can see how they are extensions of what came before. Nobody is saying “Well, I think I’ll dump my old dashboards and metrics and just use predictive analysis!” The successful new vendors still provide everything described in “legacy BI systems.” It’s extension, not replacement, and that’s further proof of evolution in action.

VisiCalc (ask your parents…) no longer exists, while Crystal Reports (again…), Cognos, Business Objects and others have been absorbed by larger companies as must-haves in any business application. In the same way, none of the current generation of software companies is saying that the latest analytics techniques are sufficient. They all support and recreate the existing body of BI tools in their own dashboards and interfaces. The foundation of what BI means keeps expanding.

“Revolutionary”?

It’s really very simple: technology companies tend to be driven by technologists. Stunning, I know. Creating in-memory analytics was a great piece of technical work; it hadn’t been successfully done until this generation of applications. We knew we were accumulating more and more data from the internet and other sources; then some brilliant people thought up algorithms and supporting technologies that let people analyze the data in much faster ways. People within the field understood that these technical breakthroughs were very innovative.

On the other side, there’s marketing. Everyone’s looking for the differentiator, the thing that makes your company stand out. Most marketing people don’t have the background to understand the technology, so when they hear that their firm is the first, or one of the first, to think of a new way of doing things, they can’t really put it into context. New becomes unique becomes revolutionary. However, that’s dangerous.

Well, not always.

To understand that, we must refer to one of the giants in the field, Geoffrey Moore. In “Crossing the Chasm” and “Inside the Tornado,” Moore formally described something that many had felt: there’s a major difference between early adopters and the mass market, and a chasm that a company must cross in order to change from addressing the needs of the former to the needs of the latter.

The bleeding-edge folks want to be revolutionary, to be visionaries. Telling them they need to spend a lot of money and a lot of sleepless nights on buggy new software because it’s a step in the right direction doesn’t work quite as well as telling them it’s a revolutionary approach. As founders like to think of themselves the same way, it’s a natural fit. We’ve all joined the revolution!

There’s nothing wrong with that, if you are prepared for what it entails. There are people who are willing to get the latest software. Just make sure you understand that complexity and buggy young code don’t mean a revolutionary solution, just a new product.

Avoiding Revolution

Then there’s the mass market. To simplify Moore’s texts, business customers want to know what their neighbors are doing. IT management wants to know that their always understaffed, underfunded and overworked department isn’t going to be crushed under an unreasonable burden. Both sides want to know their investment brings a proper ROI in an appropriate time frame. They want results with as little pain as possible.

We’ve seen that the advances are in BI, not past BI. Yet so many analysts are crying the death of something that isn’t dying. How does that help the market or younger companies? It points people in the wrong direction and slows adoption.

Not all companies have fallen for that. Take one of the current companies making the most headway with new technologies, Tableau Software. They clearly state, on their site, “Tableau is business intelligence software that allows anyone to connect to data in a few clicks, then visualize and create interactive, sharable dashboards with a few more.” They have not shied away from the label. They’re not trying to replace the label. They’re extending the footprint of BI to do more things in better ways. Or QlikTech’s self-description “QlikTech is the company behind QlikView, the leading Business Discovery platform that delivers user-driven business intelligence (BI).”

There are many companies out there who understand that businesses want to move forward as smoothly as possible and that evolution is a good thing. They’ll talk to you about how you can move forward to better understand your own business. They’ll show you how your IT staff can help your business customers while not becoming overwhelmed.

Know if you’re willing to be an early adopter or not, then question the vendors accordingly. As most of you are, purely by definition, in the mass market, move forward with the comfort that BI is not going anywhere and plenty of companies are merging the old and new in ways that will help you.

To paraphrase Mark Twain, the reports of BI’s death have been greatly exaggerated.

1010data at the BBBT: Cool technology without a clear strategy

The presenters at last Friday’s BBBT session were from 1010data. The company provides some complex and powerful number crunching abilities through a spreadsheet paradigm. As with many small technical companies, they have the problem of trying to differentiate between technology and a business solution.

Let’s start with the good side, the engine seems very cool. They use distributed technology in the Cloud to provide the ability to rapidly filter through very large data sets. It’s no surprise that their primary markets seem to be CPG and financial companies, as they are dealing with high volumes of daily data. It’s also no surprise because they must have very technical business users who are used to looking at data via spreadsheets.

The biggest problem is that spreadsheets are fine for looking at raw data but not for understanding anything except a heavily filtered subset of it. That’s why the growth of BI has been in visualization. Everything 1010data showed us involved heavy work with filters, functions, XML and more. The few graphics they showed looked twenty years out of date and quite primitive by modern standards. This is a tool for a power user.

Another issue, showing the secondary thought given to re-use and display of information, is their oxymoronically named QuickApps. As the spreadsheet analysis is done in the Cloud on the live data set, reusing the information in reports takes a lot of work. The technical presenter was constantly diving into complex functions and XML code. That’s not quick.

When asked about that, the repeated refrain was about how spreadsheets are everywhere. True, but the vast majority of Microsoft Excel™ users use no functions, or only the very simplest, sum() and a few others. Only power users create major results, and BI companies have grown by moving people from Excel to better ways of displaying results.

I must question whether CEO and Co-founder Sandy Steier understands where the company fits into the BI landscape. He constantly referred to Cognos and MicroStrategy as if they’re the current technology leaders in BI. Those solutions are good, but they are not the focus of conversation when talking about the latest in visualization or in-memory technologies. The presentation did have one slide that listed Tableau, but their web site was devoid of references to the modern generation (or they were well hidden). Repeated questions about relationships with visualization vendors were deflected to other topics and never addressed.

Of key note was an early statement by Mr. Steier that data discovery is self-service reporting. That shows the typical technical person’s confusion between technology and business needs. Data discovery is the ability to understand relationships between pieces of data to build information for decision making. Self-service reporting is just one way of telling people what you’ve discovered. Self-service business intelligence is a larger issue that includes components of both.

I very much liked the technology, but I must question whether the management of 1010data has the vision to figure out what they want to do with it. Two options, of many possible, show the need for that choice. First, they can decide to be a new database engine, providing a very powerful and fast data repository from which other vendors can access and display information. Second, they can focus on adding real visualization so they can move past the power users and let regular business users directly leverage the product’s benefits. The two strategic choices require very different tactics to implement.

To summarize: I was very impressed with 1010data’s technology but am very concerned about their long term potential in the market.

ETL across the firewall: SnapLogic at the BBBT

SnapLogic presented at the BBBT last Friday. I was on the road then, so I watched the video today. The presentation was by Darren Cunningham, VP Marketing, and Craig Stewart from product management. It was your basic dog and pony show with one critical difference for the BI space: they understand hybrid systems.

Most of the older BI vendors are still on-premises and tip-toeing into the Cloud. Most of the newer vendors are proudly Cloud. The issue for enterprises is that they are clearly in a hybrid situation, with a very mixed set of applications inside and outside the firewall. Companies talk about supporting hybrid systems, but dig down and you find out it’s one way or the other, with minimal thought given to supporting the other half.

Darren made it clear from the beginning that SnapLogic understands the importance of a truly hybrid environment. They are, ignoring all the fancy words, ETL for a hybrid world. They focus on accessing data equally well regardless of on which side of the firewall it resides. Their partner ecosystem includes Tableau, Birst and other BI vendors, while SnapLogic focuses on providing the information from disparate systems.

Their view was supported by a number of surveys they’d performed. While the questions listed had the typical tilt of vendor-sponsored surveys, they still provided value. The key slant, which has implications for their strategic planning, shows in one survey question on “Technical Requirements of a Cloud Integration Platform”: “Modern scalable architecture” came in first, while “Ease of use for less technical users” was third.

As Claudia Imhoff accurately pointed out, the basic information might be useful, but it’s clear from the presentation that this was an IT-focused survey and should be treated as such. It would be interesting to see the survey run with both IT and business users to see the difference in priorities.

SnapLogic looks like they have a good strategy; the thing to watch is how they grow. The key founder is Gaurav Dhillon, one of the founders of Informatica. He had a good strategy there but was replaced when the company grew to a point where he couldn’t figure out the tactics to get over the chasm (full disclosure: I worked at Informatica in 1999, when it hit the wall. I’m not unbiased). Let’s hope he learned his lesson. There’s a clear opportunity for SnapLogic’s software, and it seems to be going well so far, but we’ll need to watch how they execute.

TDWI: Evolving Data Warehouse Architectures in the Age of Big Data Analytics

Today’s TDWI webinar was a presentation by Philip Russom, Research Director, on “Evolving Data Warehouse Architectures in the Age of Big Data Analytics.” It was based on his just released research paper of the same name. Overall, it was a good overview of where their market survey shows the market, but I have a nit.

One of the best points Mr. Russom made was during his discussion of responses to “What technical issues or practices are driving change in your DW architecture?” The top response, with 57%, was advanced analytics. Philip pointed out that’s about as good a proof as you get of business driving technology: advanced analytics is a business need, and that need is pushing the technical decisions. Too many people get too wrapped up in technology to realize it exists to solve problems.

TDWI survey: What is data warehouse architecture?

Another key point was made about the evolving nature of the definition of data warehousing. Twenty years ago, it was about creating the repository for combining and accessing the data. That is now definition number three. The top two responses show a higher-level business process and strategy in place than “just get it!”

Where I have a problem with the presentation is when Mr. Russom stated that analytics is different from reporting. That’s a technical view, not a business one. His talk contained the reality that first we had to get the data and now we can move on to more in-depth analysis, but he still thinks they’re very different. It’s as if there’s a wall between basic “what’s the data” and “finding out new things,” concepts he said don’t overlap. Let’s look at the current state of BI. A “report” might start with a standard layout of sales by territory. However, the Sales EVP might wish to wander the data, drilling down and slicing & dicing to understand things better by industry within a territory, by cities within it, and by other metrics across territories. That combines what he defines as separate reporting and data discovery.
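To show how thin that wall really is, here is a small Python sketch in which the same in-memory table serves both the standard report and the EVP’s drill-down; the data is invented:

```python
import pandas as pd

# Invented sales data; one frame serves both uses below.
sales = pd.DataFrame({
    "territory": ["East", "East", "East", "West", "West"],
    "industry":  ["Retail", "Tech", "Tech", "Retail", "Tech"],
    "amount":    [200, 350, 125, 180, 420],
})

# The standard report: sales by territory.
print(sales.groupby("territory")["amount"].sum())

# The "discovery" step: the same data sliced by industry within territory.
print(sales.groupby(["territory", "industry"])["amount"].sum())
```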

Certainly, the basic row-based reporting can switch to columnar structures for better analytics, but that’s techie. The business user sees a simple continuum. As in most other areas where technology advances (pretty much all of life…), business software solves a problem and then lets people see past it to the next one. That successor problem doesn’t have to be completely different, and in the case of reporting and analytics, it’s not.

The final takeaway I got from the webinar supported that concept even if his earlier words didn’t. He talked about the multi-platform data warehouse environment: DWs aren’t going anywhere, they’re just being incorporated into a wider ecosystem of data technologies in order to keep improving the understanding and decision-making capabilities of business managers.

Other than the disagreement I have with his view of reporting and analytics, I heard a good presentation and suggest people check out the full report.

Qlik at BBBT. QlikView Expressor: Come for the ETL, stay for the Data Governance

Last Friday’s BBBT presentation was by Qlik. The primary purpose was to discuss QlikView Expressor, but that was just a foundation for what really caught my eye: a great advance in data governance.

Qlik bought Expressor Software last June, and the presentation was the chance to show the re-branded products to the analysts in the group. Expressor brings baby ETL to Qlik. The presenters, Donald Farmer and Bill Kehoe, were very honest and clear that the product is intended for those who start with basic self-service BI and find they need to reach multiple sources as their use expands. I’ll be clear: this is a baby product. Their public case study was, according to the slides, using ODBC and Salesforce.com’s very open API. How they can, and even whether they should, handle access to more complex and proprietary systems remains a big question.
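For a sense of what “baby ETL” over ODBC amounts to, here is a rough Python sketch of the extract-transform-load pattern. The DSN, credentials, table and columns are all hypothetical; real Expressor pipelines are built in its own tooling, not written like this:

```python
import pyodbc

# Extract: pull rows from any source exposed through an ODBC driver.
# The DSN and credentials here are placeholders, not real values.
conn = pyodbc.connect("DSN=SalesWarehouse;UID=report;PWD=secret")
cursor = conn.cursor()
cursor.execute("SELECT account_id, region, amount FROM opportunities")

# Transform: a trivial cleanup step, normalizing the region code.
rows = [(acct, region.upper(), amount)
        for acct, region, amount in cursor.fetchall()]

# Load: hand the cleaned rows to whatever the BI layer consumes.
for row in rows:
    print(row)
conn.close()
```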

As Informatica and other major players in the ETL space have strong partnerships with Qlik, it’s a careful game Qlik has to play. On one side, they have to provide some basic ETL functionality to a key portion of their market; on the other, they must not alienate the big players. Products acquired into such a middle ground often either fail from the lack of a clear solution or cause problems with partnerships, but only time will tell how Qlik will handle this. For the time being, I don’t see this product being a threat to their partners.

The presenters waffled on the early message about ETL being a way to govern access to data, and why became very clear as the presentation entered its second section. QlikView Expressor is being used as a component driving Qlik’s new QlikView Data Governance Dashboard. The company has done an amazing job of blending ETL, its existing and well-known BI presentation software, and a smart overview of the full architecture to take a very good step forward in helping companies understand where their data is being used.

As Donald Farmer pointed out, only half humorously, “Microsoft Office for the iPad has killed data governance.” KPIs defined in multiple departments, different reports on different computers and the growth of laptops made data governance difficult in the previous decades. The boom in tablet use has expanded that challenge exponentially. Having the leading business productivity suite now available on the leading tablets means company reports, spreadsheets and more are spread even further through the business ecosystem. Data governance becomes vastly more difficult to achieve.

The Data Governance Dashboard is a first step in helping IT and business users understand what information is out there, where it’s residing and how much of it is used how often.

This isn’t a blog about features, but one must be mentioned because it is critical. Knowing which data fields are accessed by which BI reports is important by itself, and Qlik and others have been looking at that. The extension that matters is their reports that begin to link the labels users see to the internal fields. Think about the ability to see that two divisions both use the label “Gross Profit” and then discover that they’re using different fields, definitions and underlying data to create the displayed numbers.
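A toy sketch makes the value obvious. Given a mapping from (report, label) to the underlying field expression, conflicting definitions fall right out; the mapping below is invented, and Qlik’s dashboard derives its version from its own metadata:

```python
from collections import defaultdict

# Invented metadata: (report, displayed label) -> underlying field expression.
report_labels = {
    ("EU Sales Dashboard", "Gross Profit"): "revenue - cogs",
    ("US Sales Dashboard", "Gross Profit"): "revenue - cogs - freight",
    ("US Sales Dashboard", "Revenue"):      "revenue",
}

# Group the field expressions behind each label.
fields_by_label = defaultdict(set)
for (report, label), field in report_labels.items():
    fields_by_label[label].add(field)

# Any label backed by more than one expression is a governance problem.
for label, fields in sorted(fields_by_label.items()):
    if len(fields) > 1:
        print(f"CONFLICT: '{label}' maps to {sorted(fields)}")
```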

Self-service BI is reaching the point where desktop computers were in the early 1990s. The 1980s saw business users run away from what they saw as a controlling IT infrastructure. It helped, and it created confusion. IT and business had to find new ways to work together, reintegrate and help both sides help the company. The Governance Dashboard can help lead the BI industry through that same step, helping both sides provide improved and consistent information to business decision makers. Well done.

TDWI Webinar – Preparing Data for Analytics with Liaison Technologies

Tuesday’s TDWI Webinar was titled “Preparing Data for Analytics,” but that’s not what it was about. It was still interesting, just misnamed. The focus was, as would be expected with Liaison being the sponsor, about how managing data in the Cloud can enhance the ability of some companies to support BI.

It started with Philip Russom, an in-house TDWI analyst, talking a bit about preparing data without ever using the words extraction and transformation. The most interesting point he made was an aside in the main presentation, but one that should be brought to the fore: what’s SMB?

Most of us think of SMB as Small-to-Medium sized Businesses. His point was that it’s really Small-to-Medium sized Budgets. Many folks involved in enterprise software understand that you often don’t sell an enterprise license up front. A software vendor will sell to a department or a business division with a smaller budget that needs to get something done; the tactic is then to expand within the enterprise and build it into a major account. Mr. Russom makes the great point that the tactics for selling into the smaller groups in an enterprise are very similar to those used to sell to smaller businesses, so maybe there are opportunities being left on the table by smaller vendors.

His other key point needs a webinar of its own. He mentioned that companies looking for Cloud analytics should “make sure Cloud solutions are optimized for data and not for applications.” It’s the data that’s important, how you get it and how you prepare it. That’s what has to be supported first, then applications can access the data. Sadly, he said that and moved on without any real details on what that preparation means. I’d like to see more details.

The main speaker was Alice Westerfield, Sr. VP of Enterprise Sales at Liaison Technologies. Her main point followed Mr. Russom’s lead-in, pushing the idea that a good analytics platform requires moving from an application-centric approach to a data-centric one. No surprise, Liaison has one handy. Most importantly, it’s a Cloud approach, one they were offering before Cloud became the buzzword.

Alice was brief but focused on their history of supporting integration in the Cloud. The four main benefits she mentioned were:

  • Data Integration
  • Data Transformation
  • Data Management
  • Data Security

That makes sense, and we all know how much the last one, security, matters before people are willing to perform the first three in the Cloud. However, it’s the changing nature of the data game that makes me want to focus on the first, data integration.

While Liaison talks about the benefits of leveraging their years of integration skills rather than reinventing them or installing new integrations in an on-premises solution, there’s another Cloud aspect that I think is critical. Most businesses use a mix of applications, and many are already in the Cloud. Add to that the mobile nature of today’s generation of BI solutions, which are provided in the Cloud. It makes sense for many SMBs to leverage that. Why take data from Cloud apps, move it on-premises and then move it back to the Cloud? Using a service such as Liaison’s simplifies and speeds the process of meshing data from inside and outside the firewall and then providing wide access to knowledge workers through the new BI interfaces.

For the foreseeable future, there will continue to be reasons for keeping data within the firewall, but for most data and most companies, a solution such as Liaison’s would seem to be an opportunity to quickly integrate data and share it as broadly as required.

Tableau Software at the BBBT

Years ago, I had a great boss. Very smart, a good people manager, understood strategy and tactics. There was one flaw. One joke was that he’d never use a sentence if a paragraph would suffice. That showed most clearly in the voluminous number of slides that were part of any presentation.

He comes to mind because I attended Tableau Software’s presentation to the Boulder BI Brain Trust (BBBT). 93 slides. Yes, 93. Even if folks were not familiar with BI, I think the presentation would have been a bit, shall we say, extensive. Just like that boss, the presenters had a lot of very interesting things to say, but they were often hidden in too much explanation. So that I don’t make the same mistake while describing the presentation, here are my key takeaways:

  • Balanced on-premises and Cloud strategy.
  • Strong forethought about changing relationship between IT & business users.

Everyone even vaguely involved in BI should already be familiar with the basics of Tableau. It is one of the leaders in the latest generation of BI solution providers, known for its visualization skills, yet it hasn’t scrimped on the technology needed to manage large data sets. A tag team of four presenters handled the session. The first was Suzanne Hoffman, Sr. Director of Analyst Relations, who gave an overview of company growth since last year’s BBBT presentation.

Francois Ajenstat, Sr. Director, Product Management, then followed with, as would be expected from product management, a review of products including a discussion of v8.1. Of the many things reviewed, the one item that caught my attention was the extension of their Cloud packages. It was clearly stated that the product wasn’t redesigned for the Cloud, just moved outside the firewall to a hosted environment, but that’s a fine first step considering the rest of the work they’ve done. What’s critical is that they’ve thought through the Cloud business model and how it differs from what’s needed for purchased software.

The one issue I have, from an analyst’s perspective, was the very brief discussion of the supposed success and early adoption of the 8.1 offering. While describing the number of companies that have rapidly adopted it, there was no breakdown. How many were new customers versus existing ones? Were they larger or smaller companies? While it is to be expected that newer companies and departments, as well as smaller companies, would be the first to adopt, and while that’s not necessarily an issue, clarity would have helped.

A look ahead at v8.2, which we were told is not under NDA, included what seems to be an extremely professional Mac port. It’s not just a move of what was on MS Windows to the Mac, as many companies do. It looks like a product designed for the Mac that will make any user of that platform very happy. While Francois did a good job explaining why they wanted to port to a platform that has 10% of the business market, I think Tableau should be more explicit about who that 10% is and why it matters. Marketing. Yes, I know, I’m biased towards marketing, but think about the fact that marketing was the home of the Mac during the lean years. Why? Because the Mac has always been better for visualization, so marketing continued to use the platform for creating ads, graphics, charts and more. Providing a great Mac port enhances the ability of marketing to serve good analytics and presentations to the rest of the firm.

The presentation was then handed to Robert Kosara, a research scientist, to talk about visual storytelling. Here, again, there were too many examples and a blurred message. While trying to describe how the concept differs from visualization theory from Tufte onwards, the difference wasn’t made very clear. There seemed to be two main takeaways. First, they’re describing the fact that modern BI allows isolation of, and drill-down into, segments of a visualization, things that aren’t possible in the old-line static print medium. That seems intuitively obvious rather than a giant leap forward. The second takeaway isn’t about theory but about some of Tableau’s plans (again, we were told this level isn’t under NDA) to provide technology that merges visualization with presentation layout, where the application will be able to take the equivalent of slides of the visualization to increase the presentability of analysis. That’s still early work but could be very interesting as it matures.

The final presenter was Ellie Fields, VP of Product Marketing. Maybe I’m biased, as my background is in product marketing, but I wish this section had not been last. The most important thing in her presentation was what I think best brings out why Tableau is a leader. Her section focused on the changing environment for BI implementations. Everyone in the industry acknowledges that the power of the BI front-end is giving business users more and more ability to create and manage their own analytics. We know that IT is going to have to change from a controlling focus to one of service. Ellie Fields has thought this through and clearly defined how the change doesn’t obviate the need for IT but only changes the relationship between the two sides. The most important slide in the deck is the one below, with a high-level description of that partnership.

Tableau IT and Biz partnership

The past growth of Tableau Software has clear reasons. It was a good technology introduced to the market at the right time. What they’ve done so far is a sign of management’s ability to understand an early market. Continued growth lies across Moore’s chasm, and a longer-term strategy is critical. In my (probably not so humble) opinion, while technology is critical, there are far too many technology companies that failed with great products. It’s the vision shown in both Francois Ajenstat’s discussion of managing both on-premises and Cloud markets and Ellie Fields’s understanding of how to position the company to help businesses through a major change in how IT and business users relate that shows the strength of the company moving forward.

I just wish that information had bubbled up and been more visible rather than lost in a bunch of other words that weren’t really needed.