Companies launch initiatives to upgrade or improve their sales & operations planning and demand planning processes all the time. Many of these initiatives fail to deliver the results they should. Has your forecasting function fallen short of expectations? Do you struggle with “best practices” that seem incapable of producing accurate results?
For ten years, the editorial team at Foresight: The International Journal of Applied Forecasting has been telling readers about the struggles and successes of forecasting professionals and doing all we can to educate them about methods and practices that really work. We do that with articles contributed by forecasting professionals as well as respected academics and authors of highly regarded books.
As Founding Editor of Foresight, I’d like to invite you to join us for the upcoming Foresight Practitioner Conference entitled “Worst Practices in Forecasting: Today’s Mistakes to Tomorrow’s Breakthroughs.”
This 1.5-day event will take place in Raleigh, North Carolina, October 5-6. There we will take a hard look at common practices that may be inhibiting efforts to build better forecasts. Our invited speakers will share how they and others have uncovered and eliminated bad habits and worst practices in their organizations for dramatic improvements in forecasting performance.
Some of the topics to be addressed include:
• Use and Abuse of Judgmental Overrides
• Avoiding Dangers in Sales Force Input to Forecasts
• Improper Practices in Inventory Optimization
• Pitfalls in Forecast Accuracy Measurement
• Worst Practices in S&OP and Demand Planning
• Worst Practices in Forecasting Software Implementation
Foresight is published by the non-profit International Institute of Forecasters (IIF), an unbiased, non-commercial organization dedicated to the generation, distribution and use of knowledge on forecasting in a wide range of fields. (Smart Software’s own Tom Willemain serves on Foresight’s Advisory Board.) Foresight is just one of the resources made available by the IIF. Additional publications, a host of online resources, an annual symposium and periodic workshops and conferences are available to all IIF members. The Smart Forecaster previously interviewed IIF past president Dr. Mohsen Hamoudia. Visit the IIF site for information about joining.
(Len Tashman is the editor of Foresight: The International Journal of Applied Forecasting. The unusual practice-related conference he describes, upcoming in October 2016, will appeal to many readers of The Smart Forecaster. For instance, those who have received Smart Software’s training have been alerted to the possibility that overriding statistical forecasts can backfire if done cavalierly. Two sessions at the conference focus on the use of judgment in the forecasting process. — Tom Willemain)
Related Posts

You Need to Team up with the Algorithms
This article is about the real power that comes from the collaboration between you and our software, a collaboration that happens at your fingertips. We often write about the software itself and what goes on “under the hood”. This time, the subject is how best to team up with the software.

Rethinking forecast accuracy: A shift from accuracy to error metrics
Measuring the accuracy of forecasts is an undeniably important part of the demand planning process. A forecasting scorecard can be built on either of two contrasting viewpoints for computing metrics. The error viewpoint asks, “how far was the forecast from the actual?” The accuracy viewpoint asks, “how close was the forecast to the actual?” Both are valid, but error metrics provide more information.
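To make the distinction concrete, here is a minimal sketch in Python contrasting the two viewpoints. The function names, the choice of MAE and MAPE as the error metrics, and the convention of defining accuracy as 100% minus MAPE are illustrative assumptions, not the specific metrics used in the article or in Smart Software’s products.

```python
# Illustrative sketch only: helper names and the "accuracy = 100% - MAPE"
# convention are assumptions, not a definitive scorecard implementation.
import numpy as np

def error_metrics(actual, forecast):
    """Error viewpoint: how far was the forecast from the actual?"""
    actual = np.asarray(actual, dtype=float)
    forecast = np.asarray(forecast, dtype=float)
    err = forecast - actual
    return {
        "MAE": float(np.mean(np.abs(err))),                  # mean absolute error
        "MAPE": float(100 * np.mean(np.abs(err) / actual)),  # mean absolute percent error
    }

def accuracy_metric(actual, forecast):
    """Accuracy viewpoint: how close was the forecast to the actual?"""
    return 100.0 - error_metrics(actual, forecast)["MAPE"]

actual = [100, 120, 90, 110]
forecast = [95, 130, 85, 100]
print(error_metrics(actual, forecast))    # MAE 7.5, MAPE about 7.0%
print(accuracy_metric(actual, forecast))  # about 93.0 on the "accuracy" view
```

Note how the single accuracy number hides the size of individual misses, which is one reason the error viewpoint tends to be more informative.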

Every Forecasting Model is Good for What it is Designed for
With so much hype around new Machine Learning (ML) and probabilistic forecasting methods, the traditional “extrapolative” or “time series” statistical forecasting methods seem to be getting the cold shoulder. However, it is worth remembering that these traditional techniques (such as single and double exponential smoothing, linear and simple moving averages, and Winters models for seasonal items) often work quite well for higher-volume data. Every method is good for what it was designed to do. Just apply each appropriately: don’t bring a knife to a gunfight, and don’t use a jackhammer when a simple hand hammer will do.
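As a concrete illustration of one of the traditional methods mentioned above, here is a minimal sketch of single exponential smoothing in plain Python. The function name, the initialization with the first observation, and the smoothing constant alpha = 0.3 are illustrative assumptions, not a recommended configuration.

```python
# Illustrative sketch of single exponential smoothing; alpha and the
# initialization are arbitrary choices for demonstration only.
def single_exponential_smoothing(series, alpha=0.3):
    """Return one-step-ahead forecasts for each period in the series."""
    forecasts = [series[0]]  # seed the first forecast with the first actual
    for actual in series[:-1]:
        # next forecast = alpha * latest actual + (1 - alpha) * previous forecast
        forecasts.append(alpha * actual + (1 - alpha) * forecasts[-1])
    return forecasts

demand = [120, 118, 135, 128, 131, 140, 125, 133]
print(single_exponential_smoothing(demand))
```

With a steady, higher-volume series like this, even such a simple recursion tracks the level reasonably well, which is the point of the excerpt: match the method to the data rather than reaching for the most elaborate tool.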