Category Archives: Business Intelligence

Tableau Software at the BBBT

Years ago, I had a great boss. Very smart, good people manager, understood strategy and tactics. There was one flaw: the joke was that he'd never use a sentence if a paragraph would suffice. That showed most clearly in the voluminous number of slides that were part of any presentation.

He comes to mind because I attended Tableau Software's presentation to the Boulder BI Brain Trust (BBBT). 93 slides. Yes, 93. Even if folks were not familiar with BI, I think the presentation would have been a bit, shall we say, extensive. Just like that boss, the presenters had a lot of very interesting things to say, but much of it was hidden in too much explanation. So that I don't make the same mistake while describing the presentation, here are my key takeaways:

  • Balanced on-premises and Cloud strategy.
  • Strong forethought about the changing relationship between IT & business users.

Everyone even vaguely involved in BI should already be familiar with the basics of Tableau. It is one of the leaders in the latest generation of BI solution providers. They are known for their visualization skills yet haven't scrimped on the technology needed to manage large data sets. They had a tag team of four presenters. The first was Suzanne Hoffman, Sr. Director of Analyst Relations, who gave the overview of company growth since last year's BBBT presentation.

Francois Ajenstat, Sr. Director, Product Management, then followed with, as would be expected from product management, a review of products including a discussion of v8.1. Of the many things reviewed, the one item that caught my attention was the extension of their Cloud packages. It was clearly stated that the product wasn't redesigned for the Cloud, just moved outside the firewall to a hosted environment, but that's a fine first step considering the rest of the work they've done. What's critical is that they've thought through the Cloud business model and how it differs from what's needed for purchased software.

The one issue I have, from an analyst perspective, was the very brief discussion of the supposed success and early adoption of their 8.1 offering. They described the number of companies that have rapidly adopted it, but there was no breakdown. How many were new customers versus old? Were they larger or smaller companies? It is to be expected that newer companies and departments, as well as smaller companies, would be the first to adopt, and that's not necessarily an issue, but clarity would have helped.

A look ahead at v8.2, which we were told is not under NDA, included what seems to be an extremely professional Mac port. It's not just moving what was on MS Windows to the Mac, as many companies do. It looks like a product designed for the Mac that will make users of that platform very happy. While Francois did a good job explaining why they wanted to port to a platform that has 10% of the business market, I think Tableau should be more explicit about who that 10% is and why it matters. Marketing. Yes, I know, I'm biased towards marketing, but consider that marketing was the home of the Mac during the lean years. Why? The Mac has always been better for visualization, so marketing continued to use the platform for creating ads, graphics, charts and more. Providing a great Mac port enhances the ability of marketing to serve good analytics and presentations to the rest of the firm.

The presentation was then handed to Robert Kosara, a research scientist, to talk about visual storytelling. Here, again, there were too many examples and a blurred message. While trying to explain how the concept differs from the visualization theories that have evolved from Tufte onwards, the difference wasn't made very clear. There seemed to be two main takeaways. First, they're describing the fact that modern BI allows isolation of and drill down into segments of the visualization, things that aren't possible in the old-line static print medium. That seems intuitively obvious rather than a giant leap forward. The second takeaway isn't about theory but about some of Tableau's plans (again, we were told that this level isn't under NDA) to provide technology that merges visualization with presentation layout, where the application will be able to take the equivalent of slides of a visualization to increase the presentability of analysis. That work is still early but could be very interesting as it matures.

The final presenter was Ellie Fields, VP of Product Marketing. Maybe I'm biased, as my background is in product marketing, but I wish this section had not been last. Most important is that her presentation was what I think best brings out why Tableau is a leader. Her section focused on the changing environment for BI implementations. Everyone in the industry acknowledges that the power of the BI front end is giving business users more and more ability to be involved in creating and managing their own analytics. We know that IT is going to have to change from a controlling focus to one of service. Ellie Fields has thought this through and clearly defined how the change doesn't obviate the need for IT but only changes the relationship between the two sides. The most important slide in the deck is the one below, where a high-level description of that partnership is presented.

Tableau IT and Biz partnership

The past growth of Tableau Software has clear reasons. It was a good technology introduced to the market at the right time. What they've done so far is a sign of management's ability to understand an early market. Continued growth means crossing Moore's Chasm, and a longer-term strategy is critical. In my (probably not so humble) opinion, while technology is critical, far too many technology companies have failed with great products. It's the vision shown in both Francois Ajenstat's discussion of managing on-premises and Cloud markets and Ellie Fields' understanding of how to position the company to help businesses through a major change in how IT and business users will relate that shows the strength of the company moving forward.

I just wish that information had been bubbled up and made more visible rather than lost in a bunch of other words that weren't really needed.

Kalido: A solution in search of the message?

I had the fortune to see Kalido presentations twice in two days. The first was at the Qlik road show event on Thursday and the second was the Boulder BI Brain Trust call on Friday.

Kalido provides a streamlined way to create and manage data warehouses. The key seems to be a strong data modeler linked to the engine, providing a more graphical way to define processes and linking that to the automatic creation of the physical data layers and data warehouse management. According to their case studies, the result is a significant savings in time to deploy and manage warehouses. As they pointed out in the BBBT presentation, the deployment savings are a clear and compelling argument, but the longer-term savings in ongoing operational costs is one they haven't yet successfully attacked.

That ties in to the issue of their major message to the Qlik audience and on their web site: “No ETL!” As anyone who understands their technology knows, and as they pointed out in their BBBT presentation, ETL is one component of their solution. The presenter on Thursday tried to claim it's not ETL but ELT, because they use a temporary data store to more quickly extract information from operational systems, but that's not going to cut it. ETL is still performed, even if in a slightly different order. IT people will understand that and laugh at the claim, while most BI business users won't know what it means and the rest won't care, as it's not a major concern for people trying to get information out of the warehouse.
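To make the ordering argument concrete, here is a minimal, self-contained sketch with toy data and function names of my own invention (not Kalido's actual pipeline): the same three steps run either way; only the point at which the transform happens changes.

```python
# Toy illustration of ETL vs. ELT ordering. The data, functions and the
# "staging" list are hypothetical stand-ins, not any vendor's architecture.

def extract(source_rows):
    """Pull rows from a (toy) operational system."""
    return list(source_rows)

def transform(rows):
    """Conform the data; here, just standardize the region code."""
    return [{**r, "region": r["region"].upper()} for r in rows]

def etl(source_rows, warehouse):
    # Transform happens before the data reaches the warehouse.
    warehouse.extend(transform(extract(source_rows)))

def elt(source_rows, staging, warehouse):
    # Land raw rows in a temporary store first (fast, low-impact extract),
    # then run the very same transform as a later step.
    staging.extend(extract(source_rows))
    warehouse.extend(transform(staging))

source = [{"customer": "Acme", "region": "ne"}]
wh_etl, wh_elt, staging = [], [], []
etl(source, wh_etl)
elt(source, staging, wh_elt)
assert wh_etl == wh_elt  # the same work gets done either way
```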

Operational costs matter to both IT and the business line managers. Since many IT centers internally “bill” divisions for costs, operational savings have an impact and matter to both sides far more than a specious ETL message.

More importantly, the ability to change your business model and have it rapidly reflected in the data warehouse is of strong value to decision makers. Eliminating 6-9 months of rework, only to see the changes already out of date by the time they're done, is a clear and compelling message for business decision makers. The ability to rapidly satisfy business users in changing markets while using fewer IT resources is valuable to the IT organization.

So why does the message seem to be missing a great market focus opportunity? One possible answer is found on their executive team page. Rather, it's what's not found there. A company that wants to leave the startup phase and address a wider market would do well to emphasize marketing with the same importance as engineering and professional services. Products aren't enough; you have to create messages that address what interests stakeholders. Kalido seems to have a very good product, but they aren't yet able to create messages that address the wider market.

Salesforce, BI, analytics and Birst

This is the third and final blog in my series about last week’s Salesforce1 event in Philadelphia.

I've discussed the growth of the Salesforce ecosystem and how the company is bringing back the integration of SFA, CRM and help desks to ensure customer-facing systems work together. They also have partners with applications such as accounting, quoting and other add-ons that round out the full process. The final big issue is how that needs to change reporting.

It's easy to create charts and dashboards for single applications focused on a department or function. That's not why business intelligence exists. Upper management needs to better understand how their organizations are performing, and that means analyzing information that comes from multiple systems. BI's goal is to provide actionable information about the business. That's why it grew alongside data warehouses that gathered information from disparate systems. Now that Salesforce is clearly creating an integrated environment, are their analytics growing apace? Sadly, no.

“Analytics Unleashed” was the key session that interested me above all others. However, to paraphrase Gertrude Stein, there wasn't much there there. It looked like simple reports slapped into basic dashboards with no real drill down or data discovery capability. Admittedly, the major part of their demonstration didn't work and the capability might have been in that, but I didn't see anything in the exhibit hall to make me think there was.

Well, not from Salesforce directly, but don’t be without hope. I’ve talked about the ecosystem they’ve clearly built with Salesforce1, and that comes with advantages. One of those advantages is Birst, a business intelligence firm.

Birst is a Salesforce partner that, unlike a number of other partners at the road show, isn't exclusively focused on Salesforce. They are in the new generation of BI firms working to modernize the market. They work to eliminate the need for separate ETL and BI vendors, providing an integration platform that then directly supports leading-edge analytics capabilities.

That ability combines with a SaaS architecture to allow Birst to work well with Salesforce1. I talked with them at their stand and the integration seemed clean from a UI standpoint. I’d have to dig deeper to understand how well Birst helps Salesforce integrate with other systems, but things look hopeful.

Salesforce seems to be viewing BI the same way as the other large companies. Even with their more modern history, advanced SaaS architecture and appropriately Web nimble interfaces, Salesforce still thinks of analytics as reporting. If “we” get all applications under our control, we just extend our reporting. That’s not BI. Analytics is a more advanced way of providing better information for decision making.

At some point, I’m sure Salesforce will come to that understanding, and then who knows what will happen. Until then, Salesforce customers can still access the next generation of BI and analytics through the ecosystem that allows Salesforce and Birst to work in concert.

Salesforce1 and “The Internet of Customers”

This is the second part of a series on the Salesforce1 road show held in Philadelphia on March 6, 2014.

One key point repeated throughout the road show event was that behind every device in The Internet of Things is a customer. Good point. The question is whether Salesforce is beginning to truly address customers and not just prospects.

In the early 1990s, I worked at Aurum Software, one of the companies that created the CRM market. In those days, all the companies, Scopus, Vantive and a few others, were trying to address the full customer interface. We had a single database and built early SFA, CRM and customer support applications on top of it. Unfortunately, neither the hardware nor the software allowed for both quick development and support for the number of users across all the branches. In addition, since the areas were all pretty new, most prospects wanted to start with one of the three applications. That caused the three branches to split up, and they've been separate for most of the last twenty years.

Salesforce is one of the companies working to bring it all together again. As one of the oldest and most successful of the SaaS companies, they’ve been focused on the power of the cloud and how to expand. In the road show, they did a good job of showing SFA, CRM and customer support applications working in concert to benefit companies wanting to understand the true picture of their interactions with their customers.

The company has acquired ExactTarget and Pardot to provide the confusing pair of automated marketing and marketing automation. ExactTarget provides list management, scheduling and automated distribution of emails (automated marketing). Pardot helps marketing manage campaigns and leads through those campaigns (marketing automation). I was very impressed by both the Pardot software and team. The software works well on its own and has tight integration with Salesforce SFA, though there's still work to do in order to show good closed-loop reporting.

On the support side, Salesforce advertises their Service Cloud, supposedly powered by Desk.com. I'm not as impressed by this as I am by the CRM, but the strategy is in place. In addition, having been designed for the internet, Salesforce has a good partner ecosystem, with a number of products and SIs at the road show to help add service to the other customer-facing systems.

The technology is finally in place to build integrated systems that can give management a view of all their customer interactions, and Salesforce has the strategy in place to achieve that. However, as they’ve been focused on acquiring the pieces and building the integration, the one piece still missing is true business intelligence, and that will be the focus of my next post.

SFA & CRM for BI: Sales and Marketing redux

A few weeks back, I blogged about the necessity for sales and marketing to work in a symbiosis rather than a power struggle. This post is about what that means for SFA and CRM.

In the early 1990s, I worked at a startup named Aurum Software. It was one of the first companies to provide CRM solutions. On a humorous tangent, I'm happy CRM won. Our founder came from the ERP space, and he had us call the solution Customer Resource Planning no matter how many times we told him CRP just wasn't as polite an acronym as the competition's CRM.

One of the keys to the early success of Aurum and its competitors is that the core databases and software were set up to manage all customer-facing applications. That meant we had sales, marketing and help desk front ends using the same database, so management could see a full picture of customer interaction.

Unfortunately, it was at a time when both software and hardware were much slower and CRM, as it still is, was very data intensive. To get people going, most ISVs concentrated on one of the three aspects and the split between CRM and SFA was born.

The result, over the past twenty years, was that the two systems rarely talked well with each other. Sales personnel would track their prospects through the funnel, marketing would track prospect and customer touch points in communications, but the two were rarely linked, and poorly linked when they were.

I worked with multiple companies who couldn't combine sales and marketing information with any degree of business intelligence. One would just import the SFA's first contact, and that would always be the sole “lead source” tracked. Another did the opposite: the first campaign that ever received a ping from someone at a company was always considered the lead source for campaign analysis. It didn't matter at either firm that the contact might have been a couple of years old and that a more recent contact (sales or marketing) had restarted the process.

One company tried to close the loop but didn't know how. Marketing tracked campaigns in their CRM system but not in their SFA system. A lead would be passed to sales; when the information came back, the campaign details were no longer attached and the loop couldn't be closed.
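For illustration, here is a minimal sketch of what closing that loop requires: the campaign identifier has to travel with the lead into the SFA system so closed revenue can be joined back to the originating campaign. The record and field names are hypothetical, not any particular vendor's schema.

```python
# Toy closed-loop reporting example with made-up ids and field names.

campaigns = {"C-100": {"name": "Spring webinar", "cost": 5000}}

# Marketing hands off the lead *with* its campaign id attached.
lead = {"lead_id": "L-1", "campaign_id": "C-100", "company": "Acme"}

# When sales converts the lead, the opportunity keeps that campaign id
# instead of dropping it, which is where the loop usually breaks.
opportunities = [
    {"opp_id": "O-7", "campaign_id": lead["campaign_id"],
     "stage": "Closed Won", "amount": 40000},
]

# With the id preserved, campaign analysis becomes a simple aggregate.
revenue_by_campaign = {}
for opp in opportunities:
    if opp["stage"] == "Closed Won":
        cid = opp["campaign_id"]
        revenue_by_campaign[cid] = revenue_by_campaign.get(cid, 0) + opp["amount"]

for cid, revenue in revenue_by_campaign.items():
    c = campaigns[cid]
    print(f"{c['name']}: cost {c['cost']}, closed revenue {revenue}")
```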

What's needed is for both sales and marketing to realize that BI requires a tighter link. Fortunately, the power of systems has advanced to the point that better integration is starting to happen. In a bit of foreshadowing, I'll say that I'm excited to be going to Salesforce.com's Salesforce1 World Tour tomorrow. Salesforce burst onto the scene years ago with a simple SaaS SFA platform, but it was clear they always intended more. Now, with the growth of their own CRM offerings, it's starting to come together. I'm interested in seeing more, asking more, then blogging more.


BI: Expert Systems and Knowledge Engines. You say potato…

When I studied artificial intelligence (AI) in the mid-1980s, an argument was raging about whether or not it could ever be solved. That argument continues today, but it's intriguing to understand why. It's not that we have made no progress; it's about the real, underlying definition of what we think of as AI. If both academics and practitioners were honest, they'd admit that the definition of AI is “getting machines to do the rest of what we call human intelligence that we haven't yet figured out how to do with machines.” Notice that it's a negative definition: AI is what we still don't know how to do. Therefore, until we have AI, it's always unsolved.

Vision, robotics and other full-fledged disciplines were considered AI in the early days. When we understood those problems well enough to solve chunks of them, or at least to investigate them as specific units of study, they became their own areas. What was left was still AI, the “unsolvable problem.”

I bring this up because my area of interest in AI was expert systems. I had the honor of working on a project under Bruce Buchanan, one of the creators of Dendral and MYCIN (two of the earliest expert systems). It was a business application of expert systems, using rules to get a computer to plan and budget. Working to get the professor's brain onto the whiteboard was what made me realize I wanted to be on the business side of computing rather than the computing side of business.

Expert systems as a whole were oversold and earned a bad reputation. So, just as rap became hip hop, other terms came into use. Today we talk about “rule-based systems,” “knowledge systems” and other similar techniques to help analyze Big Data. Companies talk about “intelligent agents” for customer support and prospect advice. There is a plethora of terms to describe what are, essentially, expert systems.
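Strip away the labels and the underlying technique looks much as it did then. Here is a minimal sketch of a forward-chaining rule engine, the core of what now gets marketed as knowledge systems or intelligent agents; the facts and rules are toy examples of my own invention.

```python
# A toy forward-chaining rule engine: facts plus if-then rules, applied
# repeatedly until nothing new can be inferred. Facts and rules are made up.

facts = {"open_support_tickets > 5", "renewal_due_within_30_days"}

rules = [
    # (conditions that must all hold, fact to assert when they do)
    ({"open_support_tickets > 5"}, "customer_at_risk"),
    ({"customer_at_risk", "renewal_due_within_30_days"}, "alert_account_manager"),
]

changed = True
while changed:                      # keep firing rules until a fixed point
    changed = False
    for conditions, conclusion in rules:
        if conditions <= facts and conclusion not in facts:
            facts.add(conclusion)   # the rule "fires" and asserts a new fact
            changed = True

print(sorted(facts))                # includes the inferred "alert_account_manager"
```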

Why does that matter? While we might be able to solve new problems in business, adding significant value to software, most of what we do is evolutionary rather than revolutionary. That’s a good thing, as IT and most of the mass market want to know that they can add new capabilities without having to spend time, money and mental anguish over “transforming your organization!”

When looking at the new techniques for better understanding data, for predictive analysis and for other areas of business applications, know that knowledge systems have a long and strong history, regardless of a founder's or a marketing organization's addiction to a revolutionary message. Spend time to see whether the vendor sees past its revolutionary message to the evolutionary solution needed by most firms.