August 14, 2014

Analyze This: Introducing Tidemark Advanced Analytics

Christian Gheorghe

Most companies recognize the advantages of data-driven decision-making. And because they are the de facto stewards of corporate performance data, CFOs are increasingly viewed as strategic partners in management.

Unfortunately, existing planning and analysis tools make it impossible for CFOs to fully live up to that role. The inherent limitations of these tools prevent CFOs from pinpointing the key factors affecting their business, leave them unequipped to leverage the advanced analytics that would improve their decisions and the accuracy of their plans, and render them powerless to engage stakeholders beyond basic FP&A functions.

For today’s organizations, which can’t afford to operate based on an obsolete, rear-view-mirror view of their business, these shortcomings are simply unacceptable. This is why we are introducing Tidemark Advanced Analytics. This collection of innovative new features enables decision-makers to identify the factors that impact their business, which in turn helps them narrow their focus in ways that lead to better decisions.

Tidemark Advanced Analytics handily addresses both sides of the data conundrum: business users harness the full potential of data, but they’re not overwhelmed by it. We achieve this by putting intuitive statistical analytics into the hands of every manager and stakeholder in the business – and in a way that lets them focus on outcomes, rather than on complex calculations. This leads to more accurate plans and risk-adjusted forecasts that incorporate the data most likely to influence outcomes. Meanwhile, managers can sift out confusing or misleading correlations that too often leave them drowning in data.

We designed Tidemark Advanced Analytics based on the input and guidance we’ve received directly from customers. They told us that, to operate in the now, they need the ability to:

  • Quickly and easily identify business drivers and risks. Any manager can leverage our in-memory computational cloud to understand advanced statistical metrics, such as the coefficient of correlation (a minimal sketch of this kind of driver-ranking analysis follows this list). That way, they can identify what is really driving the business and what poses the greatest risk.
  • Intuitively run scenario analyses and what-if modeling based on data context. Tidemark also integrates advanced, contextual analysis within recognized planning and forecasting processes, so managers can focus on the right factors at the right time and within the right process.  When everything is presented and analyzed in context, there’s less guesswork and confusion, and faster time to insight.
  • Democratize analytics to include more business users – not just the experts in finance. With Tidemark’s intuitive, consumer-like user experience, there’s no longer a need to depend solely on specialized data science expertise or hard-to-use statistical tools to incorporate the statistical insights gained from correlation and regression analysis. As a result, business users can more easily focus on the impact of a few key risks, resulting in plans they can act on with confidence.
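To make the first bullet concrete, here is a minimal, hypothetical sketch in Python (using pandas) of the kind of driver-ranking that correlation analysis enables. The column names, numbers and interpretation are invented for illustration; this is not Tidemark’s implementation, just the underlying statistic.

    # Hypothetical sketch: rank candidate business drivers by their correlation
    # with an outcome. All column names and values are invented.
    import pandas as pd

    # Monthly history: the outcome we plan against, plus candidate drivers.
    history = pd.DataFrame({
        "revenue":     [1.00, 1.04, 1.10, 1.07, 1.15, 1.21, 1.18, 1.26],
        "web_traffic": [0.90, 0.95, 1.02, 0.99, 1.08, 1.14, 1.10, 1.20],
        "headcount":   [1.00, 1.00, 1.01, 1.02, 1.02, 1.03, 1.03, 1.04],
        "ad_spend":    [1.20, 0.70, 1.30, 0.80, 1.40, 0.60, 1.10, 0.90],
    })

    # Coefficient of correlation between each candidate driver and revenue.
    drivers = history.drop(columns="revenue")
    correlations = drivers.corrwith(history["revenue"])

    # Rank by absolute correlation so the strongest relationships surface first.
    ranked = correlations.reindex(correlations.abs().sort_values(ascending=False).index)
    print(ranked)
    # Weakly correlated drivers (here, ad_spend) are candidates to drop from the
    # plan, so managers can focus on the few factors that actually matter.

In Tidemark the computation runs in the in-memory computational cloud rather than in a script, but the underlying statistic, the coefficient of correlation, is the same.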

This is what Tidemark Advanced Analytics delivers. Now business users across the enterprise – not just data scientists and IT experts – can harness advanced computation and statistical capabilities in the context of their financial and operational analytics. So whether you’re determining optimal pricing strategies, developing a hiring strategy that reduces churn, or identifying the most promising targets for a university fundraising campaign, Tidemark Advanced Analytics can help you achieve the outcomes you’re after.

An Industry First, Available Now

All this represents an industry first. No other provider has integrated these advanced statistical analytics capabilities within a financial and operational planning platform. And true to our history, we’re not holding onto these breakthroughs until we roll out the next big release months from now. Because our cloud-first software applications are updated automatically, new features appear seamlessly, and our customers don’t have to slog through software upgrades and version management headaches.

We at Tidemark remain laser focused on advancing our core planning, forecasting and budgeting capabilities. So even as we introduce breakthroughs like Tidemark Advanced Analytics, our customers will continue to see us regularly refine, improve and extend the features they use day in and day out.

Our customers never stop innovating, and neither will we.

August 6, 2014

Doing More With Less – Financial Planning for Higher Education

Ryan Chan

I think it’s fair to say that college students don’t give much thought to what it takes to keep their school running smoothly. They show up to take classes, enjoy a social life, explore opportunities, and graduate.

This certainly was true for me. I don’t blame my 18-year-old student self, because when you’re that age, you tend to be pretty self-obsessed. So when my school’s registration systems became overwhelmed, or when my Poli Sci final grade failed to appear on Blackboard the day after the exam, or when my scholarship funding took seemingly forever to be reflected in my tuition balance, it never really occurred to me that operating a college or university was exceptionally hard work.

It still is – in fact, it’s tougher than ever. And last week, I learned just how difficult things have gotten when I attended the Campus Technology 2014 Conference at the Hynes Convention Center in Boston. I learned that my perception of higher education over a decade ago couldn’t have been more naive or simplistic. After speaking with numerous higher ed professionals, I came to realize the vast resources required to run an entire university. And if this wasn’t already challenging enough, falling taxpayer support has led to subsidy decreases, tuition increases and greater fiscal pressures on institutions.

Stephen Rituper explains to a Campus Tech 2014 attendee how Tidemark Financial Planning for Higher Education can help.

A common theme I heard at the convention was that colleges and universities are increasingly “doing more with less.” With rising tuition comes even greater expectations, yet schools must somehow deliver on these expectations by relying on static or shrinking overall revenues.

So how do higher ed business officers do more with less?

Most of those I spoke with felt they had no choice but to somehow work smarter — to eliminate activities that aren’t producing results and replicate or expand the ones that do. But doing so requires administrators to understand what’s really going on within the university and to make optimal decisions based on real-world data. Typically, universities use Excel for big data analysis, but a big complaint I heard at the conference was how difficult it is to identify specific line items within huge spreadsheets with dozens, even hundreds, of tabs, and then to use those spreadsheets to collaborate with colleagues.

Last year, Tidemark worked with Brown University to help them move from an annual planning process to a cloud-based, continuous planning environment. Brown’s new environment unifies the separate planning models and data repositories that previously existed across four different schools, giving budget owners and planners a single view of plans and analytics that can be shared across the university.

Brown University, University of Miami, Florida Atlantic University and the Association of American Medical Colleges choose Tidemark.

Our work with Brown and other higher ed institutions helped us identify the distinct processes higher education institutions follow when they plan. Consequently, we developed Financial Planning for Higher Education, a packaged application designed specifically to reinvent strategic planning, revenue planning, personnel budgeting and control, departmental budgeting, capital planning and grant planning. Designed for use by any budget owner across the institution, Tidemark pushes planning and budgeting to the edges so department heads can understand and plan by funding source, grant, position, use, and more. And it uses unique features like Tidemark Storylines so a university’s financial story can be easily understood by provosts, donors, regulators and alumni.

Balancing higher expectations with shrinking revenues is no easy assignment. But Tidemark’s growing roster of higher ed customers – including Brown, the University of Miami, Florida Atlantic University and the Association of American Medical Colleges – can attest that “doing more with less” is suddenly getting a lot easier.

August 1, 2014

Too Important to Stay the Same

Caroline Japic

A defining characteristic of Tidemark customers is that they’re not interested in maintaining the status quo. They’re innovators by nature. It’s just who they are.

I just spent an entire day happily swimming in one of these pools of innovative thinking. The place was Kansas City, the headquarters of Cerner, a leader in the $66 billion global health IT solutions market. Cerner’s technology solutions and services are designed to help more than 9,300 healthcare facilities put information right where people need it most, at the moment they need it. The company’s tagline sums up its obsession with innovation: Healthcare is too important to stay the same.

Well, we at Tidemark feel the same about financial and operational planning and analytics. And so, it turns out, does Cerner. But more on that later.

We were in Kansas City at Cerner’s Riverport Training Campus on the Missouri River, which is its own kind of reinvention. The campus is located in what was formerly Sam’s Town Casino—and the lights of the Steakhouse and Great Buffet still remain as a bright neon reminder.  This unique and fabulous facility is used for Cerner meetings and employee training sessions. This week it was the site of Cerner Datacon ‘14, a daylong immersion for roughly 500 of the company’s analysts and data scientists. Cerner hosts this annual internal confab because it’s very much a data-driven company, and because it wants the people who own and work with critical data to have access to Cerner’s founders and other company leaders.

Cerner’s Co-Founder and Vice Chairman Cliff Illig opened the conference with a powerful keynote about growing an enterprise. Illig traced his own story as owner of Sporting Kansas City, the American professional soccer club based in Kansas City, Kansas. The club is a member of the Eastern Conference of Major League Soccer and one of the ten charter clubs of MLS, having competed in the league since it began play in 1996. As a Soccer Mom myself, I was thrilled to hear how Illig has driven the success of SportingKC using the same principles used to grow Cerner. As an investor and business leader, Illig always looks for ROI. And last year his Club really delivered, winning the prestigious MLS Cup.

Our role at Cerner Datacon was to make sure the Cerner team realized they have access to a powerful, flexible and cloud-based computational platform that can help them get so much more from their data. Today, Cerner uses Tidemark software for capital planning and headcount planning – both pretty critical functions for a growing $2.9 billion global enterprise with more than 14,000 employees.  At Cerner, these processes were previously handled in Excel and locked into once-a-year schedules that left this fast-growing, data-driven enterprise working from information that often wasn’t current. With Tidemark, Cerner has a continuous forecasting environment where real-world data is updated in real time. Now that’s innovation.

To those 500 data pros in Kansas City this week, Tidemark offers ways to innovate even further – and to make the information they work with even more useful and actionable. My day at Cerner Datacon was filled with conversations around those possibilities, including:

  • Transforming information into knowledge and action
  • Understanding the revenue implications of changes to a product or service
  • Automatically generating the right data visualization for the audience at hand
  • Working with an analytics tool that doesn’t bury users in cells and confusing dashboards
  • More easily surfacing correlations and causations that help analysts accurately predict likely outcomes

Whether it’s healthcare or data analytics, Cerner understands when something is just too important to stay the same. And I can tell you it’s a thrill to go swimming in their pool of innovation.

July 24, 2014

Planning for Nightmare Scenarios – But Without the Nightmares

Caroline Japic

I recently came across a blog post from CEB’s Peter Young, in which he offers an excellent 10-step guide to scenario planning. Young’s idea is that, with the right methodology in place, it’s possible for companies to anticipate and blunt the damage caused by future calamities – and possibly even turn some disasters into revenue opportunities. To achieve this with some reliability, they need to combine core best practices, a good dose of procedural structure, cross-disciplinary talent and data-driven models, and then apply all of these to the process of predicting likely scenarios.

Scenario planning can be a nightmare

Young’s process is all about ensuring that your organization stays competitive in the face of potentially high-impact events. In FP&A, we talk about modeling what-if scenarios. In fact, it’s a fundamental capability of Tidemark’s software. But these are what-ifs on steroids – potential catastrophic events that could strangle your supply lines, scare off customers, or obliterate your competitive advantage.

Here’s my quick summary of Young’s list, though a more detailed version appears on his blog. Worth planning for, don’t you think?

  1. Formulate the most important problems you’re likely to face and the most crucial decisions you’ll need to make.
  2. Create a team representing the key disciplines across your company.
  3. Generate a list of major future drivers, or the events and influences likely to impact your business.
  4. Refine and rank these major driving forces.
  5. Establish the nature of the “alternate worlds” your company may face, including the logic and framework needed to test scenarios.
  6. Create a quantifiable model based on the scenario’s major forces.
  7. Gather data and quantify your assumptions, including historical data, internal structured data, and external unstructured data – everything you need to make assumptions that mean something.
  8. Run the model you built based on the what-if scenario framework you designed (a toy illustration of steps 6 through 8 follows this list).
  9. Create data-driven narrative scenarios and outline your opportunities.
  10.  Present your results to senior management.
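As a rough illustration of steps 6 through 8, here is a deliberately tiny what-if model in Python. The drivers, scenarios and revenue formula are invented for illustration; they are not taken from Young’s guide or from any Tidemark model.

    # Toy model for steps 6-8: quantify a few major forces, then run scenarios.
    # Every driver, scenario and formula here is invented for illustration.
    baseline = {"demand_growth": 0.05, "input_cost": 1.00, "supply_disruption": 0.00}

    scenarios = {
        "baseline":     {},
        "supply_shock": {"supply_disruption": 0.30, "input_cost": 1.25},
        "price_war":    {"demand_growth": -0.05, "input_cost": 0.95},
    }

    def modeled_revenue(drivers, base_revenue=100.0):
        """Simple driver-based model: growth, lost supply, then cost pressure."""
        grown = base_revenue * (1 + drivers["demand_growth"])
        after_disruption = grown * (1 - drivers["supply_disruption"])
        margin_factor = 2.0 - drivers["input_cost"]  # rising costs erode the result
        return after_disruption * margin_factor

    for name, overrides in scenarios.items():
        drivers = {**baseline, **overrides}
        print(f"{name:15s} -> modeled revenue {modeled_revenue(drivers):6.1f}")

The point is simply that once the major forces are quantified (step 6) and the assumptions are grounded in data (step 7), running the alternate worlds (step 8) becomes a mechanical exercise.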

As Young points out, “Scenario planning can be complicated and resource intensive.” And if I were managing this process, I’d probably be losing some sleep over the steps that involve wrangling and analyzing lots of data, flowing that information into useful scenarios, and finally finding a clear way to communicate to decision-makers what these “alternate worlds” might mean for the company.

And I’d get no sleep whatsoever if I were trying to get all this done by relying on spreadsheets or an old-school FP&A platform that affords little flexibility for on-the-fly scenario modeling. Just imagining it makes my head hurt.

The picture gets a lot brighter, however, once you factor in a modern business planning and analytics platform. Though it would streamline virtually every step in Young’s guide (not to mention FP&A overall), a solution like Tidemark would be a lifesaver in several key aspects of scenario planning, including:

  • Data gathering. Step 7 in Young’s scenario planning guide could trigger a nightmare for organizations whose internal data is stuck in silos, and whose legacy enterprise platforms impose rigid limits on how they import and use data – especially the unstructured stuff from external sources like social media networks, email and market trend info. Modern FP&A software is far more flexible. For instance, Tidemark’s architecture reorders the old “ETL” data integration process so incoming data is transformed only when business users need to access it. This makes it far easier to augment internal financial and operational data with the growing volumes of unstructured data that provides meaningful context to recognized performance metrics. The result is that your models for anticipating future events are based on more real-world data and fewer made-up assumptions that could ultimately be proven wrong in practice.
  • Analysis and modeling. A modern solution like Tidemark leverages the cloud as a computational platform. This means users access a computational grid that is free of cubes and that doesn’t limit data use by constraining volumes or dimensions. It also means data-driven decisions can be processed up to 10 times faster than typical cube-based approaches, which helps eliminate lag time and confusion over outdated information. So those complicated what-if scenarios don’t take forever to model.
  • Communicating scenarios and possible outcomes. An Excel spreadsheet or numbers-laden report might contain all the information senior executives need to anticipate how another Asian tsunami could constrict parts availability and threaten revenues, but getting to the point of insight requires a ton of heavy lifting for both presenter and audience. Tidemark innovations like Storylines and Playbooks use interactive infographics to represent complicated scenarios and potential opportunities in ways that anyone can understand – quickly and easily.


The best thing about modern, cloud-based FP&A solutions like Tidemark? Scenario planning is just one use for them. They’re also instrumental in enabling a continuous planning environment that pushes budget and forecast ownership (and participation) to every manager in your company. And the data that informs plans, budgets and forecasts is always current, so you’re never working off outdated information.

And yes, they’ll also help remove the nightmares from your scenario planning. Because there are few things more valuable than a good night’s sleep.

July 15, 2014

Memo to Finance: Take a Run-Time, Real-Time Approach to Big Data

Christian Gheorghe

Stories about big data appear in The Wall Street Journal with some regularity. But it’s rare to see an article like “Big Data Chips Away at Costs,” which ran recently in WSJ’s CFO Journal, because it does such a great job describing how the proper use of big data can help CFOs improve financial performance by gaining a better understanding of business drivers and hidden costs.

The article is viewable only by subscribers, so here’s a recap: CFOs are analyzing big data to determine what parts of the business aren’t working (GM crunched parts costs, labor trends and market predictions to figure out it had to pull out of the European auto market); trim capital spending (Planet Fitness uses data on guest traffic patterns to lengthen the life of treadmills and other capital equipment); meet peak demand without increasing labor costs (Lowe’s uses security cameras to track customers in stores so it can better understand when stores need more employees on the floor and when they need fewer); and free up cash (AT&T boosted free cash flow by a third, in part by using big data to pinpoint use cases where videoconferencing could replace costly in-person travel).

But as the article points out, some CFOs have concerns. There’s such a thing as too much information, they say. And because big data now comes from so many different sources, those who have been on the fence about it may stay there a while longer. In fact, 40 percent of CFOs surveyed late last year by American Express Global Corporate Payments said they had no plans to invest in big data initiatives over the next year.

To these CFOs, big data probably seems to be little more than a distraction – one they don’t have time for. And I’d argue they’re right, if their view of using big data is to gulp down great masses of information in the hopes of accidentally discovering a tasty morsel of insight.

Bring it Into Context

But if you bring big data into the context of financial and operational processes – beginning with an understanding of what you want from the data, and why it’s important – then you don’t have to swallow an ocean.

What the companies profiled in CFO Journal already know – and what reluctant CFOs might suspect – is that big data alone has very little value without context. Context is the meaning that surrounds the data. It’s the “secret sauce” that helps separate the useless information, of which there is plenty, from the really useful stuff. It helps you separate the tiny needle from the big data haystack. Take the Lowe’s example. If you just looked at raw numbers of how many visitors a store gets in a day, well, that’s helpful. But if you take the Lowe’s approach and track how many shoppers physically visit which departments and when, then you have the context needed to understand why the paint department should have an extra staffer on hand between 10 am and 2 pm – when DIY moms like to shop for paint while their kids are in daycare.
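As a purely hypothetical sketch of that kind of contextual aggregation, here is what the “which department, and when” question looks like in a few lines of Python with pandas. The store data, department names and staffing threshold are all made up; this is not how Lowe’s or Tidemark actually does it.

    # Hypothetical sketch: turn raw visit counts into staffing context by
    # aggregating per department and hour. Data and threshold are invented.
    import pandas as pd

    visits = pd.DataFrame({
        "department": ["paint", "paint", "paint", "lumber", "lumber", "paint"],
        "hour":       [10,      11,      13,      10,       15,       19],
        "visitors":   [42,      55,      48,      12,       20,       9],
    })

    # Context: not just "how many visitors today", but which department, when.
    by_dept_hour = visits.groupby(["department", "hour"])["visitors"].sum()

    # Flag department-hours that exceed what one staffer can reasonably cover.
    NEEDS_EXTRA_STAFF = 30
    print(by_dept_hour[by_dept_hour > NEEDS_EXTRA_STAFF])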

Like big data, contextual information comes from a lot of places, including from people (such as staffing data and performance evaluations) and from collaboration (like social network activity and emails).

As reader comments to the CFO Journal article point out, making use of this data has in the past proven costly and time-consuming. Well, no kidding. As more and more data is brought into the business by applications, sensors, people and machines, understanding the financial and operational context of data becomes even more crucial. This is especially true because working with larger volumes makes us more susceptible to becoming fooled by randomness, such as false correlations that merely affirm our preconceived notions of what the data will tell us.

From Business Needs Come Meaningful Data

So how to find context in a big data world?  Start by looking at what the business actually needs.  What do business users want to learn from the source data? What are the business use cases that could benefit from the torrent of data in today’s digital economy?  What correlating activities and content will help answer these questions?

This has a natural “funneling” effect that speeds the analysis process and fits the way companies operate today. And when you consider that the typical enterprise application draws data from as many as 70 to 90 sources, it’s immediately clear that traditional ways of integrating data won’t work. Mapping all that data, from all those sources, will require a small army or a large fortune. In most organizations, you’ll get neither. So what do you do?

A smarter approach is to reorder the process for making data usable by your application. Today, most organizations extract huge volumes of data, then transform it into a format their application will recognize, and finally load it into their data warehouse. The modern, cloud-aware alternative is to defer the need to transform data until the moment business users need it. It’s the approach taken by customers of Tidemark’s cloud-first business planning and enterprise analytics solutions. They aren’t pursuing the old familiar ETL (extract, transform, load) approach because it’s inherently limited, takes months to complete, and is effectively frozen once you’re done.

A Run-Time, Real-Time World

Instead, our customers use big data in a way that’s more flexible and agile: extract your files and data, and then load them into a secure, cloud-based storage and computation platform. Then, when the application has to respond to a particular business user need and execute on the data, you transform it on demand. This approach – call it ELT – lets you load as much data as you want and then transform it as business users ask the questions all that unstructured data will help them answer.
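Here is a bare-bones sketch of the ELT idea in Python. The function names, record format and “transform on read” pattern are generic illustrations of deferring the transform step; they do not describe Tidemark’s actual architecture.

    # Generic sketch of ELT: land raw data as-is, defer transformation until a
    # business question actually touches it. All names are illustrative.
    import json
    from datetime import datetime

    raw_store = []  # stands in for cloud object storage

    def extract_and_load(raw_lines):
        """E + L: store source records untouched, with no upfront mapping."""
        raw_store.extend(raw_lines)

    def transform(record_line):
        """T, applied on demand: parse and normalize only what the query needs."""
        record = json.loads(record_line)
        return {
            "date": datetime.strptime(record["ts"], "%Y-%m-%d").date(),
            "region": record.get("region", "unknown").lower(),
            "amount": float(record["amount"]),
        }

    def amount_by_region(region):
        """Answer a business question; transformation happens at run time."""
        rows = (transform(line) for line in raw_store)
        return sum(r["amount"] for r in rows if r["region"] == region)

    extract_and_load([
        '{"ts": "2014-07-01", "region": "EMEA", "amount": "1200.50"}',
        '{"ts": "2014-07-02", "region": "APAC", "amount": "940.00"}',
    ])
    print(amount_by_region("emea"))  # -> 1200.5

The practical consequence is that the transformation cost is paid only for the data a question actually touches, and a new source or format doesn’t force you back into an upfront mapping exercise.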

Our customers – innovators like Netflix and Brown University – have to succeed in a run-time, real-time world. They can’t predict where the next bit of contextual information will come from.  And they certainly can’t anticipate every possible way their application will use source data ahead of time. But by turning the transform step into an on-demand task, the traditional enterprise data integration stack becomes utterly transparent to end users.   An ugly and expensive manual process becomes a seamless, automated one.

In an increasingly cloud-to-cloud world, this is how enterprises will make use of big data. ELT will help business users not only cope with their growing big data haystack, but also probe it for the right needles – the answers that will help propel their business.