October 3, 2014

Using Data Analytics to Optimize Revenue and Enrollment

Ryan Chan

It’s no secret that the expenses associated with higher education extend well beyond semester tuition fees. As a growing number of reports and articles point to ballooning educational costs and shrinking job opportunities for graduates, universities and private colleges across the country are scrambling for new ways to convince students to invest in an undergraduate or graduate degree with them. Academic leaders committed to finding long-term solutions for admissions-related struggles must first evaluate the issues surrounding a student’s decision to enroll and how they can better convince students that the money they are willing to invest is well spent.

Evaluating Current Trends in Matriculation

A quick analysis of current statistics and general sentiment will reveal that public opinion regarding the inherent value of education has changed significantly since the onset of the 2008 financial crisis. Whereas certain areas of the country, such as the Midwest, are experiencing an increase in the number of students completing their high school diploma, many of the more densely populated cities and states, such as California, Massachusetts and New York, are experiencing little, if any, current or forecasted rise in matriculation in the upcoming decade.

For those who do graduate high school, the college application process has grown significantly more intensive. On average, students are applying to more higher education institutions than ever before, primarily in an attempt to offset the increased competition amongst those trying to gain admittance to more prestigious colleges.

Changing Public Sentiment

Cynicism regarding the merits of higher education is at an all-time high. Although a variety of financial resources are available for those hoping to enroll in college, such as government-backed and private student loans, the stigma placed upon borrowing has led many students to forsake enrollment opportunities in order to avoid financial debt.

“Cost-awareness” is becoming increasingly prevalent among both students attending college and their families. Studies have shown that the number of students applying for need-based aid is rising, which may reflect the fact that families are generally choosing to spend less on college-related expenses than they were in 2009. It is also worth noting that only 57% of students accepted admission offers from their “first-choice” school in 2013, largely because they believed the costs of their education would be too high.

Discounts and Funding

Although more institutions are now offering increased tuition discounts in a bid to lure new students, these tactics have produced little, if any, increase in enrollment. In fact, some institutions have experienced significant decreases in enrollment following substantial tuition discounts, an observation that has left academic leaders throughout the country at a loss for what the “correct” step to bolster attendance may be. These circumstances have prompted both Moody’s and S&P to release ‘negative’ forecasts for higher education. As a general observation, students now appear to place increasing emphasis on the academic and professional resources available to them at a given college or university rather than focusing on its reputation alone.

A Data-First Approach To Higher Education Enrollment

Although data analytics professionals may be tempted to dismiss information associated with students who opted not to enroll, it is important to take stock of the type of aid offered to them and how these individual decisions affect the net total revenue (NTR) of the institution at large. For example, in situations where fewer students opted to enroll and yet were, on average, paying more for their education, the NTR of the academic institution may be higher than in situations where a larger number of students were admitted at markedly reduced tuition rates.
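
The arithmetic behind that comparison can be sketched in a few lines. The figures below are purely hypothetical, and the simple formula (enrollment × sticker price × net of the average discount rate) is an illustrative simplification, not any institution’s actual NTR model:

```python
# Hypothetical net-total-revenue (NTR) comparison. All figures are
# invented for illustration; real models would include fees, aid
# sources, attrition and other factors.

def net_total_revenue(enrolled, sticker_price, avg_discount_rate):
    """NTR = enrolled students x sticker price x (1 - average discount rate)."""
    return enrolled * sticker_price * (1 - avg_discount_rate)

# Scenario A: a smaller incoming class paying closer to full price
ntr_a = net_total_revenue(enrolled=900, sticker_price=40_000, avg_discount_rate=0.30)

# Scenario B: a larger class admitted at markedly reduced tuition rates
ntr_b = net_total_revenue(enrolled=1_100, sticker_price=40_000, avg_discount_rate=0.45)

# With these particular numbers, the smaller, lightly discounted class
# produces the higher NTR despite lower enrollment.
print(f"Scenario A: ${ntr_a:,.0f}  Scenario B: ${ntr_b:,.0f}")
```

Plugging in an institution’s actual enrollment and discount figures turns this toy comparison into a first-pass NTR check across admission scenarios.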

Careful analysis of the relationship between offered tuition discounts, the source of each discount (i.e. which department is compensating for the student’s reduced tuition), and the final outcome of the offer (whether or not the student decided to enroll) will allow institutions to refine their predictive modeling and develop an accurate yield analysis that will be invaluable when generating future admission strategies.
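
As a minimal illustration of that kind of yield analysis, the sketch below groups admission offers by discount band and computes the yield rate (the share of admits who enrolled) for each band. The records, bands and field names are all invented for the example; a real analysis would draw on actual offer data and likely feed a predictive model:

```python
# Minimal yield-analysis sketch over hypothetical offer records.
from collections import defaultdict

# (discount_rate, funding_source, enrolled) -- invented example data
offers = [
    (0.10, "admissions", True), (0.10, "admissions", False),
    (0.25, "department", True), (0.25, "department", True),
    (0.40, "department", True), (0.40, "admissions", False),
]

def yield_by_band(offers, bands=(0.0, 0.2, 0.5)):
    """Return the yield rate (enrolled / offered) per discount band."""
    counts = defaultdict(lambda: [0, 0])  # band -> [enrolled, total offers]
    for rate, _source, enrolled in offers:
        band = max(b for b in bands if rate >= b)  # highest threshold met
        counts[band][0] += int(enrolled)
        counts[band][1] += 1
    return {band: enrolled / total for band, (enrolled, total) in counts.items()}

print(yield_by_band(offers))
```

The same grouping keyed on `funding_source` would show which department’s discounts are actually converting admits into enrollments.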

Additionally, institutions should pay careful attention to situations where an initial aid offered was appealed by either the student or their parents, and what the final fiscal outcome of the appeal was (i.e. if a larger discount was then offered to the student, and what additional funds were needed to convince them to enroll).

Ultimately, predictive modeling strategies such as these will allow institutions to accurately answer some of the most important questions affecting their NTR, including whether the discounts they are offering to students are hindering or nurturing the growth of the university or college and, thus, its ability to further maximize “bang-for-buck” potential for future students.


Essentially, the success of an academic institution, particularly in an era when state funding is being dramatically reduced, is largely dependent on whether or not institutional leadership adopts the same tactics and strategies used by private business owners today. Data collection and financial planning and analysis can provide a degree of insight that may make the difference between success and failure.

September 19, 2014

The Power of Predictive Analytics

Caroline Japic

Imagine that on your drive to work tomorrow morning, you spend your entire commute looking in the rearview mirror. Sound like a good idea? How far would you get before something serious – even catastrophic – happens?

It’s unthinkable, I know. But incredibly, companies do this every day. They depend too heavily on a historical view of their business to predict what might happen next. In today’s business environment – where markets turn in an instant and customer loyalty is fleeting – that’s a recipe for disaster.

Just ask Walgreens, which last month shook up its executive ranks after disclosing that its previously publicized earnings forecast was off by about $1 billion. Not all forecast errors are so enormous (or their repercussions so painfully public). But there’s no question that in an age where every company is awash in real-time data, successful businesses will find ways to combine that always-current data with analytics technology to produce forecasts they can count on: forecasts built around the factors that drive their business and the risks that could affect it, models that incorporate not only performance data but the crucial context that gives it meaning, and intuitive what-if scenarios that let decision-makers understand what the impact of their actions will be.
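
A driver-based what-if scenario of the kind described above can be sketched in a few lines. Every driver name and number below is hypothetical; the point is only that when a forecast is expressed in terms of its drivers, changing one assumption immediately shows its projected impact:

```python
# Toy driver-based forecast. All drivers and values are invented.

def forecast_revenue(leads, conversion_rate, avg_deal_size, churn_rate, base_recurring):
    """Next-period revenue: new business plus retained recurring revenue."""
    new_business = leads * conversion_rate * avg_deal_size
    retained = base_recurring * (1 - churn_rate)
    return new_business + retained

baseline = forecast_revenue(leads=5_000, conversion_rate=0.04,
                            avg_deal_size=12_000, churn_rate=0.08,
                            base_recurring=3_000_000)

# What if a retention program cuts churn from 8% to 5%?
scenario = forecast_revenue(leads=5_000, conversion_rate=0.04,
                            avg_deal_size=12_000, churn_rate=0.05,
                            base_recurring=3_000_000)

print(f"Baseline ${baseline:,.0f} -> scenario ${scenario:,.0f} "
      f"(impact: ${scenario - baseline:+,.0f})")
```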

At Tidemark, we like to summarize that concept this way: Run in the Now, Impact the Future.

In fact, relying on predictive business analytics (PBA) is fast emerging as a defining characteristic of market leaders, with Gartner estimating that companies that implement predictive business performance metrics will increase their profitability by 20 percent by 2017.

It’s an important enough topic that we’re hosting a free webinar with Proformative on Wednesday, Sept. 24, from 11 am to noon Pacific. The webinar will feature author and business performance guru Larry Maisel, whose consulting firm DecisionVu has helped Ford, Citigroup, NBC Universal, Pfizer and other Fortune 1000 companies develop and implement strategies that increase operating performance and improve shareholder value.

“Business Analytics, Forecasting, Financial Planning: The Recipe for Impacting Performance Across the Enterprise” will provide attendees with a better understanding of why and how to implement PBA. Maisel will draw on his experience and groundbreaking thought leadership in the field to outline the steps every company must take to make predictive analytics part of its forecasting and planning processes, including defining guiding principles for PBA, adopting and developing a PBA framework, and establishing a monthly operating review to consistently measure and monitor the metrics that matter.

Maisel will also explore how these PBA best practices are being integrated with financial planning and advanced analytics platforms to improve performance across an entire enterprise. In addition, these concepts will come to life when Maisel shares a case study detailing his work with MetLife.

After attending this free webinar, participants will:

  • Understand the concepts and inter-relationships of business analytics, dynamic planning and forecasting, and employee performance management
  • Discover how to transition from a static to dynamic planning and forecasting environment that links strategy to operations
  • Identify opportunities to develop and leverage analytics that deliver timely, relevant and accurate information to those who impact financial and operational performance

One CPE Credit (Finance, Basic) is available for webinar participants. And as a special thank-you from Tidemark, webinar attendees whose questions are chosen* for the Q&A portion will receive a free copy of Maisel’s latest book (with Gary Cokins), Predictive Business Analytics: Forward Looking Capabilities to Improve Business Performance.

Register today, and find out how your company can join the ranks of the industry leaders who forecast the future with predictive analytics – rather than a rearview mirror.

*If Q&A time runs short, webinar hosts will randomly select questions from all those submitted.

September 5, 2014

Time for a Finance Revolution

Colby Moosman

Boston knows a good revolution when it sees one. So there’s probably no better place for finance executives to learn how they can move their planning, forecasting and analytics from where they’ve been to where they need to go.

Next week, CFOs from around the country will muster in Boston at two Innovation Enterprise conferences – the FP&A Innovation Summit and the Finance Transformation Summit – to hear firsthand how their counterparts from companies like Walmart, TIAA-CREF, Kraft and Sony have revolutionized their own financial processes. They’ve freed themselves from the tyranny of spreadsheets and annual budgets and opted for flexible, real-time environments that enable continuous planning and extend FP&A beyond a few users in finance.

I’m excited to be speaking at both conferences Monday – the Finance Transformation Summit at 9:40 a.m. and the FP&A Innovation Summit at 4:10 p.m.

Modern FP&A is now critical to the ongoing and future health of businesses. It’s a new era: information moves faster than ever, and there is more of it available to be analyzed and applied. Traditional FP&A platforms and processes weren’t built for this. Consider that the average financial close slogs on for 23 days, and that it takes the average organization six months to fully understand the impact of any given business decision. What CFO wants to run a business on six-month-old data?

In my talks on Monday, I’ll draw on real-world experiences by Tidemark customers to communicate a range of concepts and best practices that should be helpful to CFOs considering their next steps. But here’s a quick primer on the new set of business requirements for FP&A:

  • Get out of the spreadsheet jockey and cube business. Finance spends too much time managing spreadsheets and multi-dimensional cubes. To stay competitive, it should shift valuable FP&A resources to analysis, assessing new revenue opportunities, and other activities that add value to the business.
  • Drive continuous business alignment. Finance is being asked to make sure corporate strategy and numbers are in total alignment – not just once a year but constantly.
  • Remove constraints on volume, type and timeliness of data.  Every business now runs on data, and CFOs need to find ways to consolidate and analyze the huge volumes of internal and external data that can inform decisions and forecasts.
  • Deeply engage at the edge. In a typical organization, only 11 percent of employees are in any way involved in FP&A. That’s not nearly good enough if companies hope to respond at the pace of today’s business and operate with a new degree of precision. They need a way for decision-makers at every level to access real-time data and run the what-if scenarios that will show them the implications of their decisions.
  • Stop arguing about the numbers. It’s hard to work from a single set of numbers when FP&A processes are plagued by disjointed data silos and lag times. Finance executives need a modern system that gets everyone working from the same set of real-time numbers.
  • Don’t accept tools that force you to conform. Historically, finance has had to bend the way it works to fit the demands of IT systems and Excel spreadsheets. It’s time for finance to make the demands – and to start using solutions designed around how finance works.

Modern FP&A has become a business imperative, and the list above sets a high bar for any FP&A solution. In Boston, I’ll be exchanging ideas with CFOs who may already be questioning whether their current processes can come anywhere near that bar. And I suspect many will be wondering if it’s time for a revolution.

August 29, 2014

But is it Built for the Cloud?

Ashmi Chokshi

The percentage of enterprises adopting cloud-based Software as a Service (SaaS) solutions has grown five-fold in the past four years, according to new research from North Bridge Venture Partners and Gigaom Research. For software and infrastructure providers, all this boils down to money – and plenty of it. IHS projects businesses will spend $174 billion on cloud architecture and services this year, and $235 billion by 2017.

With so much at stake, it’s not hard to see why every software and services provider is fashioning its offerings as cloud-based, cloud-ready or cloud-friendly.

Enterprise decision-makers, though, would be wise to look closely when evaluating new solutions. They should consider what these phrases mean, and whether the solutions they’re assessing offer capabilities and benefits that only true cloud solutions (such as Workday, Okta, Tidemark and others) can deliver. Look closely enough, and you are likely to see where a “cloud-ready” solution may not be ready for the cloud at all.

A new paper available today – Five ways to identify a “true” cloud solution – can help you distinguish between premise-based platforms now sporting a few cobbled-in cloud characteristics and solutions actually architected to live in the cloud, and thus built to deliver cloud-only benefits.

What makes a true cloud-first solution? From the paper, here’s a quick rundown.

  1. Automatic, painless access to innovations. Old-school client software requires time-consuming upgrades, so new features are bundled into major releases to try to minimize the pain for administrators and users. Cloud-first solutions can deliver new capabilities automatically, with no additional cost or time burdens placed on customers.
  2. Flexible configuration, not costly customization. Every enterprise deserves a solution that meets its needs and conforms to its workflow. What it doesn’t deserve is a huge consulting bill for months of meticulous customization that creates a platform so complex only the consultant understands how it works. A well-designed cloud-first solution enables self-service configuration that lets process owners preview and test an endless array of configuration changes before they’re incorporated into the application.
  3. Scalability. Businesses move critical functions to the cloud for a lot of reasons, but a primary driver is the ease with which they can scale up or down as their needs change – all without expensive changes to IT assets. It should be easy to add or remove users to manage cost and usage. No enterprise should settle for less.
  4. Cross-platform integration. Organizations are beefing up their forecasting models with all kinds of structured and unstructured data – and they don’t have months to devote to painstaking data integration.  Look for cloud-first solutions that are built to integrate with peripheral tools, and to accept data from a multitude of enterprise platforms and external sources. No organization today can afford to wait for the painstaking integration process that defined software in the pre-cloud era – and that continues to live on in many so-called cloud-ready platforms.
  5. Customer centricity. True cloud solution providers have to put customers first. Cloud customers make no significant on-premise IT investment, so there’s no reason for them to stick with a platform or vendor they’re not happy with. Once it’s time for customers to renew their subscription, they can simply move on. This is a powerful and persistent incentive for cloud providers to make their customers feel like VIPs every day. The best cloud providers understand this.

For more details and further tips on what to look for in a true cloud-first solution and why that matters, download our new paper today. I’m confident you’ll see how terms like cloud-based, cloud-ready and cloud-friendly can help you choose your next solution – though perhaps not in the way their providers have in mind.

August 21, 2014

Blowing Past Your Business Speed Limit

Hamish Nairn

What’s your business speed limit?

Oh, I’m sorry. Didn’t know you had one? But you do. You may not have noticed it before, but as the CFO, you see it whenever you request a new data view and it takes two, three, even four weeks for it to be created. Another clue is that your finance team includes a junior analyst whose full-time job is to pull raw data out of your reporting system, clean it, and analyze it in Excel – just to generate core financial metrics.

You spot it again when the CEO interrupts your latest presentation to the board with a helpful suggestion: “It would be great if we could see these sales metrics down to the store level after correcting for discounts.” Yes, you agree that it would be great, but creating it would require weeks of effort across the IT and finance teams. “That’s a great idea, and I’ve even asked for it before,” you respond. “But as I’ve learned, our data isn’t structured to give us those views on demand. Doing this would require IT to modify dimensions and rebuild the cube, which means we’d have to pull resources from other critical projects. So for the time being, we’ll probably have to keep working from the metrics we currently have.”

Clearly, there’s a limit to the speed of your business.  If you’re like most organizations, your reporting tools aren’t fast or flexible. They make changing your metrics a slow, difficult process requiring cross-team effort, so you postpone it or avoid it altogether. And management is forced to make decisions using data based on old business rules and outdated structures – like planning for the future by looking in your rearview mirror.

When you’re at the speed limit, business management is restricted. Inflexible data structures lead to inflexible business processes. It’s harder to identify trends, pinpoint risks and determine possible revenue plays.  It’s tougher to run in the now.

Meet the Three Speed Blockers

There are three major sources of business speed limits:

1: Inflexible technology. Your reporting tools are weighed down by the baggage of a legacy data environment. As your business evolves, it is very difficult to reconfigure the data in line with new structures. Changes to business rules, hierarchies or definitions generate large programs of work to reconstruct the data engine.  In fact, changes are often impossible due to brittle legacy architectures that can’t scale or bend and flex.  This becomes even more complicated if the processing capacity has memory or storage limitations.

2: Ownership conflicts. The CFO needs new data but the CIO controls it. It’s challenging to get new requests prioritized when the IT department is so busy just keeping the current data systems running. And within IT, changes to data reporting require cooperation across the data warehousing, data management, systems management and hardware teams, each with its own priorities and each with the potential to cause delays. Depending on IT’s backlog, projects that seem simple on their face can take months or years to complete.

3: Cost. Even when data reporting requests can be handled quickly and easily, changes still come with significant financial costs. Rebuilding data cubes, updating systems and conducting testing all require work. This generates labor expenses, impacts hardware performance and may be constrained by capacity or budget considerations.

These speed blockers reduce the agility of your business and increase the risk of management losing control over decision quality. The data and processes they create can’t keep up with a fast-moving business environment.

Some seek workarounds. For instance, analysts tired of waiting on change requests resort to manually calculating metrics from raw data in Excel models outside business systems. Over time, these models become cumbersome, error-ridden and stale. They are understood only by a small number of experts and pivot table gurus who can navigate the spreadsheets and the broken systems. These precious few hold the keys to visibility on business performance in the organization – a thought that should worry any CFO.

Working at the Speed of Business

The answer is to remove speed limits and eliminate the need for manual workarounds. In our work with customers like Netflix, Cerner Health and Reddy Ice, we’re seeing organizations once buried under spreadsheets now accelerating their business by taking advantage of three key strengths of Tidemark’s cloud-first, mobile-based business planning and analytics software.

  • Tidemark provides data flexibility that matches the need for business flexibility. Users can quickly and easily view metrics across many dimensions. One tap on the tablet enables managers to drill into metrics or view a different dimension. Unshackled from Excel silos, multiple collaborators can participate in the same reports at the same time.
  • Tidemark is fast. The cloud-first computation engine delivers metrics instantly in your browser, with no error-prone analysts or manual workarounds involved. Everything and everyone uses the same, up-to-date system that is maintained by Tidemark. It’s always current, always available, and always fast.
  • Tidemark puts control over the data in the hands of the managers who work at the front lines of the business. No more turf disputes between IT teams that hold up change requests. Modifications to definitions and hierarchies are straightforward, even for non-technical users. And because Tidemark hosts and maintains the system for you, you lower your ownership costs and free up more IT resources for other projects.

If the speed blockers mentioned above look familiar, perhaps it’s time to look seriously at a cloud-first solution that empowers the entire organization to manage performance and impact results, and equips managers to immediately understand the implications of their decisions.

It’s just what you need to blow past your business speed limit once and for all.