Wednesday, October 29, 2008

Leave a Legacy (follow-up)

Earlier this month, I did a blog on Leaving a Legacy (BI product, that is). In particular, I talked about how many large companies still have old 4GL application development tools such as FOCUS, NOMAD, and RAMIS. Not only are there hundreds of firms with these 1970s- and 1980s-era 4GLs installed, but most are probably still paying annual maintenance fees of 15% to 20% of the purchase price.

Following that post, I received inquiries about how difficult it really is to convert these products. Management within these organizations is demanding, "Just replace it -- it's only a reporting tool!" but their IT groups are struggling to comply with the mandate.

Many make the decision to convert to a data warehousing ad-hoc tool such as Business Objects, moving the data off the mainframe. That creates its own issues, and there are two main reasons why this approach does not work well. First, the 4GL is a full application development tool while Business Objects is an ad-hoc report writer. These are apples and oranges; the replacement does not have all the features of the original product, and it is not the paradigm that the end user expects.

Another reason is that the 4GL provides real-time access to mainframe data. A data warehouse tool does not. It only allows the user to see copied data; perhaps not up-to-date, maybe not to the level of detail they need, and often without all of the needed columns.

A better solution might be to keep things where they are and convert to a newer mainframe BI technology such as WebFOCUS or Cognos on zLinux. At one mainframe client, we automated the 4GL conversion and dramatically reduced the time, cost, risk, and skill-set requirements by baking translation logic into a software utility. We further reduced the effort by staying on the big box, which kept all the back-end data the same. To ease support issues, we moved the environment to Unix System Services.
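To make the idea of "baking translation logic into a software utility" concrete, here is a toy sketch. The syntax handled and the mapping are hypothetical simplifications; a real conversion utility must parse the full 4GL grammar, DEFINE fields, joins, report styling, and much more.

```python
import re

def translate_focus_request(request: str) -> str:
    """Toy translation of one minimal FOCUS-style TABLE request into SQL.

    Handles only the single PRINT ... BY ... pattern below -- an
    illustration of captured translation logic, not a real converter.
    """
    pattern = re.compile(
        r"TABLE FILE (\w+)\s+PRINT ([\w\s]+?) BY (\w+)\s+END",
        re.IGNORECASE | re.DOTALL,
    )
    m = pattern.search(request)
    if not m:
        raise ValueError("unrecognized request")
    table, fields, sort_key = m.group(1), m.group(2).split(), m.group(3)
    columns = ", ".join([sort_key] + fields)
    return f"SELECT {columns} FROM {table} ORDER BY {sort_key}"

print(translate_focus_request("TABLE FILE EMPLOYEE\nPRINT SALARY BY DEPARTMENT\nEND"))
```

Even a trivial rule like this shows why automation pays: once a pattern is captured in the utility, thousands of programs can be translated consistently instead of by hand.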

Even a conversion utility is not a magic wand, of course, and your enterprise BI conversion initiative will still be difficult. If somebody is telling you otherwise, you want to be careful moving forward with their advice.

Thursday, October 23, 2008

Be Careful What You Blog

Just last week, I did a seemingly innocuous blog posting on hot SAP skills. SAP has been doing creative things to find more people to fill tens of thousands of SAP openings.

Well, SAP's creativity reached out and bit me.

While I was happily typing away in my blog, my director of marketing and sales was on the phone talking with SAP about an open position leading their North American adult learning programs. Unfortunately for me, this is a match made in heaven for him and SAP. Based on his background, he is a perfect candidate for the job.

I'm happy and wish him the best of luck training individuals to work in these important SAP positions. Perhaps I should do a blog on the hot openings within my own organization?

Wednesday, October 22, 2008

Cincinnati Fast 55!

I am happy to announce that the Business Courier of Cincinnati recently named my organization one of the 55 fastest growing privately-held businesses in the local area.

In just two years, we have grown to $6 million in annual revenue and have assembled an excellent staff of over a dozen individuals working in the corporate office and 40 consultants in the field. We have offices in both Cincinnati and Columbus as well as a consulting presence in Indianapolis. I want to thank everybody who has been involved in this success.

For more information on the honors, click on the picture to read the news article.

Business Courier of Cincinnati names PPS Finalist in their Fast 55!

Software Vendors, Repent!

John Thompson, the CEO of Symantec, is warning other leaders of software companies that they need to rethink their business models. He admits that it is not an immediate problem, but he sees something on the horizon for the legacy software industry.

In an interview with Financial Mail for their October 24th issue, John talked about the emerging trend of "software as a service" (or SaaS) as opposed to the traditional way of selling shrink-wrapped products. Here is an excerpt from that article:

"[John] believes software companies don't need to start panicking - yet. In the long term, though, their profit margins will be eroded as the industry moves to a service model, he says. 'Software businesses have gross margins in the range of 80%-90%, while services businesses are in the range of 60%-65%. A well-run software company will have pretax operating margins of 25%-35% and a services business of about 15%. It's a very different model.'

Thompson says software companies have to reinvent themselves as services businesses, which will be difficult for many. But if they don't do it, they could find their future in doubt.

'Some company, without a legacy-installed base or a legacy revenue stream, is able to run up and take your business away from you,' Thompson says.

But the shift will take longer than many in the industry assume, he says. This will give the big software players time to adapt.

'The revenues of SaaS companies will not overtake the revenues of [traditional] software [companies] for a long, long time,' Thompson says. 'It's not a big threat to the profitability of software companies in the near term.'

He says the annual turnover of the world's largest dedicated SaaS vendor,, is dwarfed by Microsoft's yearly revenue. 'If [ CEO] Marc Benioff were sitting here, he'd say software's dead. That's just crap. Marc's a good friend of mine, but Marc is just promoting what he's doing.'"

Back in the days before companies could afford their own mainframe computers, people would connect to somebody else's system and pay for the time they used it. Software vendors would install their applications there and let different customers access them on the shared system.

The 1970s relationship between Tymshare and Information Builders is a good example of how BI software was offered as a service at an early point in computing history. When platforms became affordable, vendors began selling products to each company individually.

John is right that software pricing models will definitely need to change. For example, Information Builders still prices their BI product based on the size of the hosting computer. Never mind that the development costs are the same for Windows, UNIX, and System z and that they only have a single code base now anyway. If you want to run their BI on your Windows box, the price tag is about $40,000, but tell them you want to run it on your mainframe and their number goes up to probably around $1 million.

Their pricing model is based on the assumption that the bigger your computer, the more business value you will get, so therefore you should pay more. They offer you a perpetual license, so you pay them 20% of the purchase price every year for annual maintenance ($8,000 for Windows and $200,000 for the mainframe), which allows you to always have the latest release. If you get a bigger machine after the BI software is installed, you are going to be found out and have to pay the infamous "upgrade fee."
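The maintenance math above is simple enough to sketch, using the illustrative figures from this post (not actual list prices):

```python
def annual_maintenance(license_price: float, rate: float = 0.20) -> float:
    """Annual maintenance as a flat percentage of the perpetual license price."""
    return license_price * rate

# Illustrative figures from the discussion above, not actual vendor pricing
windows_license = 40_000
mainframe_license = 1_000_000

print(annual_maintenance(windows_license))    # 8000.0
print(annual_maintenance(mainframe_license))  # 200000.0
```

Over a typical five-year horizon, the maintenance stream alone equals the original license price: a very comfortable annuity for the vendor.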

Pure software vendors will pay attention to certain financial ratios, one of which is "revenue per headcount." They want to divide their annual revenue by the number of employees and come up with a ratio of about $250,000, but the higher the better. To improve their score, they only have two choices: either increase revenue or reduce the number of employees. If they have challenges on the revenue side, they will get rid of unnecessary employees (some go so far as to convert employees to partners or subcontractors so these individuals no longer impact the official ratio).

Software consulting firms are going to have much lower revenue-per-headcount ratios, as they are limited by the number of hours in the year that any given employee can bill for services. Unlike the software vendors, the services organizations might be happy with a ratio of $150,000 to $180,000. Because SaaS companies are not necessarily pure services businesses, they may be able to achieve a high ratio here, although they may have trouble attaining the high figures of today's pure software businesses.
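A quick, purely illustrative calculation shows how differently the ratio works out for the two business models (all revenue and headcount figures here are made up):

```python
def revenue_per_headcount(annual_revenue: float, employees: int) -> float:
    """The ratio software executives watch: annual revenue / total employees."""
    return annual_revenue / employees

# Hypothetical firms, sized to land near the ratios discussed above
software_vendor = revenue_per_headcount(250_000_000, 1_000)  # 250000.0
services_firm = revenue_per_headcount(18_000_000, 100)       # 180000.0

# The two levers: grow revenue, or shed headcount.
# Trimming 100 employees lifts the vendor's ratio to roughly 277,800.
trimmed = revenue_per_headcount(250_000_000, 900)
print(software_vendor, services_firm, round(trimmed))
```

The headcount lever is why, when revenue stalls, employees get converted to partners or subcontractors: the work still gets done, but the official denominator shrinks.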

For the sake of their customers and their own employees, hopefully legacy software vendors will pay attention to John calling out in the desert, "Repent!"

Wednesday, October 15, 2008

Human Decisions

In an earlier blog, I discussed how Howard Dresner says that his term "Business Intelligence" really only applies to those software applications that help humans make decisions.

Timo Elliott (Employee #8 at Business Objects) blogged on a similar topic back in 2007 with his "Will Computers Ever Help With Decisions?" article (I hate to give away the ending, but Timo answered his own question with "No"). It's an interesting blog about why people are still unhappy with their computer applications, so be sure to read it.

Timo gives us a definition of "decision" itself:

"A decision is a situation where information is lacking by definition. Many executives define their jobs as "making decisions" -- i.e. tasks that can't be automated. As computers take on the lower-level tasks, they're free to move on to more complicated choices."

Like Howard, Timo points out a significant dividing line between applications that automate tasks and those that present information to humans for a final decision. If the computer has all of the information necessary to reach a conclusion, then the result cannot be considered a real "decision." Instead, it is an automated task.

Timo concludes his blog with:

"But to decide (and err) is human. Recognizing this and setting expectations appropriately can help smooth the relationship between the people that consume information and the groups that provide it."

Notice our natural biases both for and against the human race. We think that only humans can make decisions; computers are beneath us in that capacity. However, we're quick to admit that humans make mistakes, unlike computers which are inerrant (except for when software applications make the mistakes we accidentally tell them to make -- too bad computers cannot decide to not follow our instructions!).

Monday, October 13, 2008

Hot Jobs

Back in the late 1990s when Y2K was looming, people with SAP skills were in high demand. A decade later, SAP skills are still hot although with a slight twist. In the September 29th issue of InformationWeek, Marianne Kolbasuk McGee wrote a NewsFilter article on the topic:

"A scarcity of experienced talent combined with the growing popularity of SAP's products has pumped up pay considerably for those with SAP expertise, according to a report by research firm Foote Partners. Despite the weak U.S. economy, premium pay for a dozen SAP-related skills rose by as much as 30% over the past six months, and in some cases up to 57% in the last 12 months. The study evaluated the pay of 22,000 IT pros in the United States and Canada.

SAP is aware of this increasing demand and is moving to get more talent into the pipeline through alliances with universities, lest it lose potential customers to other ERP platforms, like Oracle, amid fears of a labor crunch."

In May of this year, Marianne had written that SAP was working hard to resolve this issue:

"SAP officials estimate there's a current shortage of 30,000 to 40,000 experts needed to meet the global demand for implementations and support of SAP software. And if you talk to employers looking to hire SAP talent, they'll tell you that finding these people is no easy mission."

While that much demand sounds like a great problem for a software vendor to have, Joe Westhuizen, an executive for SAP, says that "very quickly success can also become a noose around one's neck." Joe reports that the situation has actually improved, as just a year ago there were unfilled openings for over 50,000 SAP specialists worldwide. SAP is trying a variety of ways to encourage more people to learn SAP, including quadrupling the number of universities offering courses in the product.

Many of the hot SAP skills in the 1990s were related to technical implementations, while today's in-demand specialists are more on the functional, business side. Companies need people with interpersonal skills who can help work with and explain the SAP functionality. This is a trend not only for SAP professionals, but technologists in general. In an August article, Stacy Collett wrote:

"The most sought-after corporate IT workers in 2010 may be those with no deep-seated technical skills at all. The nuts-and-bolts programming and easy-to-document support jobs will have all gone to third-party providers in the U.S. or abroad. Instead, IT departments will be populated with 'versatilists' -- those with a technology background who also know the business sector inside and out, can architect and carry out IT plans that will add business value, and can cultivate relationships both inside and outside the company."

You can have a hot job working in the software industry or in the kitchen of a McDonald's (I've done both). Your wage for doing either of these jobs will follow the same simple principle: You will be paid based on how difficult it is for the employer to replace you.

If the quick service restaurant can train somebody new in an hour to do your job flipping burgers, then you will always only make minimum wage. However, if McDonald's has to train somebody for months to come up to speed on their SAP environment, then they will pay you a premium for that expertise.

Sunday, October 12, 2008

Leave a Legacy

Despite living in the 21st century, companies today still use a variety of legacy reporting tools created in the 1970s and 1980s. Early software vendors created products such as the 4GLs (FOCUS, NOMAD, and RAMIS), QMF, and DYL280 for use on specific host computers, typically a mainframe platform such as MVS/TSO or VM/CMS. They of course designed these tools according to and limited by the capabilities of that decade's technology.

Companies had to train and support reporting end users to log into the mainframe environment and use the command-line processes and menu screens intended for data processing professionals.

Mainframe tools naturally lacked graphical user interfaces (GUIs) and instead used character-based “green screens”. Because the mainframe environment did not communicate with end-user tools such as Microsoft Office, any integration with spreadsheets, documents, or presentation slides was accomplished manually. Likewise, only data residing on the mainframe platform could be accessed (typical sources include sequential files, tape cartridges, VSAM, IMS, DB2, and others).

The new web-based BI products, on the other hand, can be architected as multi-tier environments, separating the user experience from the backend data sources. Just because the data and access mechanisms are on a mainframe does not mean that the user must use a dumb, green-screen terminal. Instead, the user can interact with the BI engine via a web tier and a simple GUI web browser. The BI engine can automatically pass the formatted reports back to the user in the desired output formats, such as HTML, Microsoft Word and Excel, PDF, and others.

The complexity of using a mainframe operating system is now hidden from the end user and performed automatically by the BI system. In addition to accessing the mainframe legacy data, the BI engine is now typically able to reach a wide range of enterprise data on multiple platforms. Relational database management systems (e.g., Oracle, UDB, Sybase, Informix, and MySQL), multidimensional cubes (e.g., Essbase), and other non-legacy sources can now be used in BI requests. In addition to data, modern BI engines are also able to communicate with enterprise packages (e.g., PeopleSoft, SAP, and Oracle) to easily obtain application data.
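The idea that one BI engine renders the same result set into whatever output format the user requests can be sketched in a few lines. This is a simplified illustration of the pattern, not any vendor's actual API:

```python
import csv
import io

def render_report(rows, columns, fmt="html"):
    """Render one query result set in the format the user asked for.

    A sketch of the multi-format idea only; real BI servers also emit
    PDF, Excel, and many other formats from the same result set.
    """
    if fmt == "html":
        head = "".join(f"<th>{c}</th>" for c in columns)
        body = "".join(
            "<tr>" + "".join(f"<td>{v}</td>" for v in row) + "</tr>"
            for row in rows
        )
        return f"<table><tr>{head}</tr>{body}</table>"
    if fmt == "csv":  # opens cleanly in a spreadsheet
        buf = io.StringIO()
        writer = csv.writer(buf)
        writer.writerow(columns)
        writer.writerows(rows)
        return buf.getvalue()
    raise ValueError(f"unsupported format: {fmt}")

result = [("East", 100), ("West", 250)]
print(render_report(result, ["REGION", "SALES"], "csv"))
```

The point is the separation of concerns: the query runs once on the back end, and the presentation tier decides at the last moment whether the user gets a browser page or a spreadsheet.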

With the proper web security in place, enterprise data can now be shared easily with clients, suppliers, business partners, and others who were outside of the mainframe environment. Reporting tools are now online at a corporate website for the proper individuals, enabling them to perform business intelligence as easily as ordering a book from Amazon.

In 2008, it has become increasingly hard for a firm to support a reporting tool that was installed before 1988. One reason is the dearth of knowledgeable individuals in old BI technologies such as FOCUS, RAMIS, and NOMAD. Take COBOL as an example of a technology that is still in widespread usage in businesses today. Despite the need, few colleges teach COBOL to their students. If universities no longer teach a critical business programming language, how can we expect them to instruct on an obscure 4GL like RAMIS?

If a company still uses NOMAD to run mission-critical human resource and payroll functions on the mainframe, will they really be able to hire somebody to replace Harold, the life-long 4GL programmer, when he retires next year? This is a significant business problem for companies using legacy reporting tools.

Some products have been traded among vendors like bent Topps baseball cards of little-known right fielders. NOMAD and SQR are good examples. RAMIS and others were acquired by Computer Associates. The original creator of the FOCUS 4GL still owns the technology, but they have effectively replaced it with a web version. The legacy products may get an occasional whitewashing and some new features painted on, but they still fail to provide today's organizations with current BI functionality.

You may already know this dirty little secret: the vendors of legacy BI tools do not really sell these products anymore. Nobody today buys a reporting language whose integrated development environment consists solely of a text editor. Instead, these vendors hope their legacy customers will continue to hold onto these cash cows so they can be milked for annual license fees, which are typically 15% to 20% of the purchase price.

I worked with a Fortune 50 financial services firm that had installed NOMAD in the late 1980s. At one point this company had over 500 4GL users, but after 20 years the usage had slowly declined to about 100 people. Few of these individuals were the original developers; they had merely inherited programs when earlier workers had left. Few people knew the actual workings of these important business processes. Few were excited about having to do this support work.

Manually converting these applications posed a major obstacle to this organization. In order to justify moving to newer BI technology, we had to reduce the time, cost, and skill requirements by building an automated translation process.

Another Fortune 50 client has over 800 business users supporting FOCUS applications and they have struggled for a decade trying to manually replace this legacy tool with a Business Objects product. A large Fortune 500 insurance company has a similar situation with over 500 FOCUS users. Decisions come down from on high saying, "It's just a reporting tool. Throw it out and replace it with Business Objects." Sorry, it's not that easy.

I worked with an IT group who had been given the "just replace it" directive. Unbeknownst to them, savvy users in remote manufacturing locations had used the 4GL to build complex, mission-critical processes to schedule their shop floor operations. Oops, who saw that coming? After all, it was just an end-user reporting tool. End users shouldn't do that kind of thing.

Over a 20-year period, legacy end-user reporting tools like FOCUS spread through organizations like an unchecked virus. Early MIS departments nonchalantly turned over reporting responsibilities to business developers. Management did not pay attention to how widespread the usage was. Now, decision makers find themselves ill and seeking powerful legacy antibiotics: it could cost them millions of dollars to treat the disease and manually replace user-written applications with new products before reaching a healthy state where they can meet today's challenging business demands.

If your company still depends on a legacy reporting tool, you have valid reasons to be worried. We are trying to cope with global competition, rapidly changing technologies, and shaky financial situations. We are in a consolidating BI market that contains leader products, emerging new technology, and declining older tools. It is definitely time to leave your legacy.

Friday, October 10, 2008

Actuate's Pay-to-Play Deal

Actuate Corporation recently announced a partnership opportunity for other software vendors to embed the Actuate BI product into their applications and resell it as their own.

OEM deals can be great for the original vendor. Just look at Oracle, whose relational database management system was embedded with large applications in the 1980s and 1990s. If vendors could get success like Oracle's through these partnerships, you would think that they would be begging other companies to resell their products.

With that in mind, consider the following details of Actuate's OEM partnership offer:

"'Actuate’s new OEM Quick Start Program provides our ISV partners with the software and services they need to enhance their applications and drive new sources of revenue,' said Nobby Akiha, senior vice president of Marketing at Actuate. 'With our Rapid-Time-to-Market methodology for addressing our partners’ development, marketing and sales challenges, they can reap the full benefits of Actuate-based applications and pursue the rewards of the increasingly heated Rich Internet Applications market.'

Additionally, the OEM Quick Start Program provides superior design environments for creating new products with increased performance, reduced development time, improved end-user experience and increased profit margins. Actuate partners also receive support and development platforms, providing a fast, effective means to deploy Actuate enhanced applications to customer communities of any size and add capacity easily and cost-effectively as their needs grow.

The OEM Quick Start Program provides participating partners with a kit that includes the following:

  • Development licenses for Actuate BIRT-based environments for up to five users on a single server

  • One online training course for one student

  • One day of application installation and setup support

  • Four days of consulting with an Actuate Professional Services expert (does not include travel and associated expenses)

  • High-quality and responsive support via phone and email throughout the development and prototyping process

  • Significant discounts on deployment software when the Actuate-based solution is ready to go to market

The Actuate Quick Start OEM Program is immediately available and priced at $35,000."

For 35 grand (plus expenses, of course), you can get copies of open-source BIRT software, one online training course for a single student, one day of installation assistance, and four days with a high-paid advisor flown in from a far-away locale. When you are ready to sell your application to customers, Actuate will give you some discounts, because you evidently have to pay for even more software licenses at that time.

That's a hefty chunk of change to get the rights to resell Actuate. If application vendors have to incur this type of cost during the upfront development effort and then pay for additional licenses each time they sell their product, how can they be profitable? Well, they try to pass along the cost to the customer and inflate their asking price. But I wonder how long customers will continue to be willing to pay high costs for software applications, especially those made with open-source BI tools.

Back at the end of September, Christopher Dawson posted a ZDNet blog entry on how he looked into buying SAS to do statistical analysis for his local school district in Massachusetts. Christopher choked on SAS's quote of $5,000.

Having worked for software vendors most of my career, I can pretty much guess that the SAS sales rep thought he was being extremely nice to Christopher and basically giving away the product for an unrealistically low dollar amount. But no, Christopher thought that was too expensive and decided instead to try to use the open-source R statistics package.

Of course, Christopher is with a small, non-profit educational organization. His story may not represent a real trend for purchasing BI software. Perhaps the mega-companies are still willing to spend big bucks for BI software products.

What's this? Today, GM is selling for under $5 per share? Ford Motor is selling for $2 per share? P&G's value just lost $10 per share? Xerox just lost half its market value? Now, that is the trend making the news. That will change the way BI software is purchased.

Maybe Actuate's offer is more of an "Empty Your Pockets" deal, where they are looking for cash. Hopefully, those vendors who want to make an investment to play with Actuate will still be able to cover the $35,000 expense. Even more, I hope they will still have customers able to buy their finished application.

BI Consultants

Occasionally, I speak with individuals who think that using outside consultants for Business Intelligence projects is a luxury and only feasible for those large organizations with substantial budgets. On the contrary, consultants often provide the most cost-effective way to achieve your BI goals.

One reason is that BI consultants are specialists with solid training and expertise in BI products. They have probably dedicated years of their careers to specific products, such as Cognos, Business Objects, WebFOCUS, SAS, Microstrategy, or Actuate. They hit the ground running and accomplish your goals efficiently.

In addition to experience and skills, an outside consultant has the time to address your BI needs. While your staff must handle day-to-day operations and put off strategic initiatives, an external consultant can focus strictly on the work at hand.

Consultants are a flexible resource for your projects; they are temporary, come in only when you need them, and leave when they are done. They can also provide your organization with a fresh, objective point-of-view. Because the typical consultant has worked with a variety of clients, he or she has seen aspects of BI with which your staff may be unfamiliar. While onsite, the consultant can transfer some of this knowledge to your associates, leaving you with a better team.

Value-added features such as these make BI consultants effective and efficient. Having performed this type of work before and being dedicated to your assignment, consultants can achieve results more quickly than your internal resources.

At my consulting organization, we have developed software utilities to automate much of the manual effort for BI activities (e.g., automatically translating legacy programs and generating the modern BI code). We put the know-how of doing the BI work into an application, which means less time, risk, and cost.

Armed with knowledge, expertise, and tools, outside BI consultants may actually provide a lower-cost option than doing the work yourself.

Thursday, October 9, 2008

Psst: SAP buying Teradata. Pass it on...

The New York Times posted some speculation today that German megavendor SAP might be considering yet another BI company to go along with its recent purchase of Business Objects.

See Ashlee Vance's blog in the October 9th BITS section (Business, Innovation, Technology, and Society). Now somebody is really climbing the ladder of inference here if the only fact upon which this rumor is based is the upcoming retirement of SAP's Klaus Kreplin. More than likely, there is also an inside leak happening here.

Somebody is guessing that SAP wants to buy a data warehouse vendor such as Teradata or Netezza.

Teradata people in Dayton, Ohio, are sure to be nervous about this. Many are just now regaining their balance after the 2007 spin-off of Teradata from NCR. Their big concern then was that Teradata might leave the Midwest and send employees to a development site on the West Coast. But hey, it's time for a new perspective: El Segundo is a lot closer to family than Walldorf.

This is a logical next step for SAP/Business Objects in order to have a full stack of BI offerings. Today, they can provide BI for their SAP operational and Business Warehouse products, but they probably want to expand into more generic BI data warehouses; purchasing a vendor like Teradata or Netezza makes sense.

If SAP does get Teradata, watch for them to try to swap out competitive databases (e.g., those from IBM and Oracle) at their customer sites.

Saturday, October 4, 2008

The Debate Continues

In his September 2008 article for B-Eye Network, Bill Inmon commented on a study which found that applications storing BI data in Data Marts typically only have a life span of about 18 months. Bill wondered why this might be true and posited the following answers:

  • The traditional data model used for Data Marts (dimensional star schema) is "not conducive to change." Bill says that "the star schema is good for static requirements, not fluid requirements." So you build it for a specific purpose and then find out a year later that it doesn't meet your needs anymore.
  • The traditional tool used for Data Marts (OLAP) is not conducive to change. For slicing-and-dicing of data, the front-end tool and the back-end data model go hand in hand. If the data model no longer meets your needs, the GUI presentation will not either.
  • Business requirements are changing. If you have an inflexible data model, it is not going to meet your needs in the future.
  • Data Marts tend to be departmental solutions for a small pocket of users. Others in the organization may not know about these applications, which leads to their decline and disuse.

"There probably are plenty of other reasons why data marts have such short lives. And interestingly, ALL these reasons are at play at the same time. It is not just one factor that causes a data mart to go into disuse. Instead, it is ALL of these factors working at the same time.

The net result of the fast expiration of data marts is that data marts start to accumulate in the corporation in large numbers. First, there are four or five data marts. Then, there are 50 or 60 of them. Then, there are hundreds of them."
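Bill's first point is easier to see in miniature. Here is a hypothetical star schema reduced to a few rows: the fact table answers the questions it was built for, but a question at a different grain (say, totals by supplier) requires changing the model itself.

```python
# Tiny, hypothetical star schema: one fact table keyed to one dimension.
sales_fact = [  # (date_key, product_key, amount)
    (20081001, 1, 120.0),
    (20081001, 2, 75.0),
    (20081002, 1, 60.0),
]
product_dim = {1: "Widget", 2: "Gadget"}  # product_key -> product name

def total_by_product(fact, dim):
    """Roll the fact rows up to the product grain -- a typical OLAP slice."""
    totals = {}
    for _, product_key, amount in fact:
        name = dim[product_key]
        totals[name] = totals.get(name, 0.0) + amount
    return totals

print(total_by_product(sales_fact, product_dim))  # {'Widget': 180.0, 'Gadget': 75.0}
```

Slicing by date or product works because those keys were designed into the fact table; ask about anything else and you are back to remodeling and reloading, which is exactly the rigidity Bill is describing.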

Of course, this is just a continuation of Bill's long-running "Data Warehouse versus Data Mart" debate with Ralph Kimball. These guys don't draw quite the attention of other debaters such as Biden and Palin or Obama and McCain, but Inmon and Kimball have been going at it for years.

The back-and-forth is often entertaining. One of my favorite discourses was when Ralph said that a Data Warehouse was just a collection of Data Marts. Bill quipped back that you can collect all the minnows in the sea, put them together, and still not have a whale.

Inmon's concept is to create a very flexible data model of enterprise data available for any business intelligence need. Bill considers the data warehouse to have characteristics such as being integrated, serving all people, having a broad audience, and providing all data details in a relational form for flexible reporting. On the other hand, Kimball's ideas are more practical, in that you create small collections of data for a specific business purpose and group of individuals. Bill pretty much agrees with the definition, saying that the data mart serves a smaller group of people and provides only summary data in a less flexible structure, such as the star schema. He just doesn't like that approach.

That is why Bill is quick to point out that with all the work you put into a BI application with a Data Mart, it is only going to last a year or so. Now multiply this by the number of different Data Marts you have within your organization, and Bill thinks you will slap yourself on the forehead and say, "Wow, I coulda had a DW!"

Thursday, October 2, 2008

BAM BI 3: The Blog

Seth Grimes expanded his "Is BAM BI?" question from within LinkedIn and wrote about it in his blog, summarizing the different viewpoints. Read more about it here.

About Me


I am a project-based software consultant, specializing in automating transitions from legacy reporting applications into modern BI/Analytics to leverage Social, Cloud, Mobile, Big Data, Visualizations, and Predictive Analytics using Information Builders' WebFOCUS. Based on scores of successful engagements, I have assembled proven Best Practice methodologies, software tools, and templates.

I have been blessed to work with innovators from firms such as: Ford, FedEx, Procter & Gamble, Nationwide, The Wendy's Company, The Kroger Co., JPMorgan Chase, MasterCard, Bank of America Merrill Lynch, Siemens, American Express, and others.

I was educated at Valparaiso University and the University of Cincinnati, where I graduated summa cum laude. In 1990, I joined Information Builders and for over a dozen years served in regional pre- and post-sales technical leadership roles. Also, for several years I led the US technical services teams within Cincom Systems' ERP software product group and the Midwest custom software services arm of Xerox.

Since 2007, I have provided enterprise BI services such as: strategic advice; architecture, design, and software application development of intelligence systems (interactive dashboards and mobile); data warehousing; and automated modernization of legacy reporting. My experience with BI products includes WebFOCUS (vendor certified expert), R, SAP Business Objects (WebI, Crystal Reports), Tableau, and others.