
Recovered Paper and Packaging Market Data

Those who sell recovered fiber—old corrugated shipping containers (OCC) or sorted office paper (SOP), for instance—struggle to find good quality sources of data. Unanswerable questions abound:
  • What is the size of the OCC or sorted office paper market in my area?
  • Where does the OCC that leaves the back of my store go? Where does the paper that members of my community recycle go?
  • How much is the fiber recovered from waste streams worth? Where can I find reliable market prices I can trust to make sure I am returning value (whether through revenue generation or cost savings) to my company or my community?

Historically, answers to questions like these have not been easy to develop because the market information available has been subpar.

Issues with Market Data

The recovered fiber industry is transaction intensive. Every day, thousands of industry players buy and sell multiple loads and various grades of recovered fiber. This is one reason that acquiring accurate, near-real-time data isn’t easy. To be confident in any quantification of the market, highly accurate and reliable market information must be based not on a small sampling of transactions, but on a large number of transactions representing both sales and purchases.

Today, this volume of data is what is known as “big data,” a term applied to datasets that are so large that they are beyond the ability of commonly used software tools to capture, manage, and process them within a reasonable timeframe. Managing transaction data at this level of detail requires special tools and abilities: advanced database software, rigorous security, deep analytical expertise and an abiding commitment to business and industry improvement. The vast majority of recovered fiber sellers and buyers don’t have the time or resources to gather and aggregate this volume of data on a monthly basis so that they can better understand the market.

The ability to find accurate data describing the market is not the industry’s only problem. Industry-wide, concerns about the credibility of the available numbers are rampant:

  • Fear that survey respondents report only the highest prices they receive if they are sellers, or only the lowest prices they paid if they are buyers. When reporting processes allow this type of manipulation, the resulting report might include pricing, but it won’t be market pricing.
  • Recent market volatility has convinced some that prices are influenced by rumors and other sentiment wending its way through the grapevine at least as much as by what is actually occurring in the market.

Every entity, from a major retail organization to a county or city governing body, requires accurate data to make sound business decisions. When that data may have been manipulated by those responding to survey questions about the prices they received or paid for their OCC or SOP, however, decision makers have less confidence in their decisions.

Solutions: 7 Tests for Data Quality

For the last 14 years, Forest2Market has been collecting, aggregating, reporting and analyzing data about market prices for commodity and other products originating in the forest. Because our business is based on data, we’ve spent considerable time and resources understanding the role of quality data in making business decisions.

Just how important is data? A recent study conducted by researchers at MIT, the University of Pennsylvania and the National Bureau of Economic Research looked specifically at the performance of 179 companies and found that the primary distinction between entities with higher performance rankings and the rest was whether they make decisions based on data and analysis or on the traditional combination of experience and intuition.

The study found that those adopting “data-driven decisionmaking have output and productivity that is 5-6% higher than what would be expected” given other factors. “Furthermore,” the study suggests, “the relationship between data-driven decisionmaking and performance also appears in other performance measures such as asset utilization, return on equity and market value.”

How much of a difference can a 5-6 percent increase in output and productivity make for the average retail company that recycles its shipping containers, or even for a city recycling program? The study’s lead author, Erik Brynjolfsson, an economist at MIT’s Sloan School of Management, noted that an increase of this size “in output and productivity is significant enough to separate winners from losers in most industries.” Companies guided by data analysis, he added, are “harbingers of a trend in how managers make decisions. And it has huge implications for competitiveness and growth.”

The challenge for decision makers these days, however, is determining which data sets are of high enough quality to produce better decisions and, therefore, better results. As New York Times reporter Steve Lohr writes, we are “swimming, if not drowning, in wave after wave of data. . . . Internet-era technologies, by one estimate, are doubling the quantity of business data every 1.2 years.” In forestry-related industries specifically, the challenge is often finding accurate, and therefore reliable, data.

Ensuring that the data you use to make decisions is the highest quality available isn’t easy. But the results of our research and experience in the data business provide insight into the questions that data providers should be able to answer. The following tests can be used to evaluate the quality of the data sets being considered.

1. How is the data collected? Data collected by survey can be manipulated. When data is collected on a transaction-by-transaction basis, directly from contracts, scale tickets, orders or invoices, however, gaming the system is difficult.

2. Is the sampling size broad enough to be representative of the market? The more data your provider collects, the better. A true reflection of what is happening in the market can only be gleaned from data that is statistically significant.

3. Is the database deep enough that it can be sliced and diced in ways that allow for more accurate readings of the market? A provider that gathers data on a transaction-by-transaction basis and tells you the volume weighted average price, whether more or less volume was traded in any given month, and the extent of the variability in prices during the month will give you a much more granular insight into the actual market.

4. Is pricing data determined based on weighted averages? Does it recognize volume discounts? A data provider that weights prices by volume produces numbers that are more representative of the market.

Example: 1,000 tons of OCC sold for $158 per ton and 100 tons sold for $126.

  • Many providers report a simple average price: ($158 + $126)/2 = $284/2 = $142 per ton.
  • Others report a volume-weighted average price, which is more reflective of the market: ($158*1,000 + $126*100)/1,100 = $170,600/1,100 = $155 per ton.
  • The difference in revenue or cost savings of $13 per ton adds up: if you sell 1,000 tons of OCC per year, your gain would be $13,000, and if you sell 10,000 tons, that amount would be $130,000.
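The arithmetic above can be sketched in a few lines of Python (the lot sizes and prices are the illustrative figures from the example, not real market data):

```python
# Simple vs. volume-weighted average price for two OCC sales.
# Each lot is a (tons, price_per_ton) pair from the example above.
lots = [(1000, 158.0), (100, 126.0)]

# Simple average treats both lots equally, regardless of size.
simple_avg = sum(price for _, price in lots) / len(lots)

# Weighted average lets the 1,000-ton lot count ten times as much.
weighted_avg = sum(tons * price for tons, price in lots) / sum(
    tons for tons, _ in lots
)

print(round(simple_avg))    # 142
print(round(weighted_avg))  # 155
```

The $13-per-ton gap between the two methods is exactly the difference discussed above; the more a market's lot sizes vary, the wider that gap can grow.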

5. Is the data reviewed by experts? Data that is reviewed by experienced market analysts is more likely to be scrubbed to remove outliers, as well as questionable and incomplete data. This level of attention to detail helps ensure that the data accurately describes the market.
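One common statistical screen for outliers is the interquartile-range (IQR) rule. This is an illustrative sketch only, with made-up prices; it is not a description of any particular provider's review process, which typically also involves human analysts:

```python
# Flag transaction prices that fall outside 1.5 IQRs of the middle
# 50% of the data -- a standard first-pass outlier screen.
import statistics

def flag_outliers(prices):
    q1, _, q3 = statistics.quantiles(prices, n=4)  # quartile cut points
    iqr = q3 - q1
    low, high = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return [p for p in prices if p < low or p > high]

# Six plausible OCC prices per ton and one suspicious entry.
prices = [150, 152, 155, 158, 154, 151, 320]
print(flag_outliers(prices))  # [320]
```

Flagged transactions would then be held back for review rather than silently averaged into a published market price, where a single bad entry can distort the reported number.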

6. Are data submissions auditable? Subjecting all data to audits adds a further layer of accuracy.

7. Are reports customizable to specific business needs? Does the data provider work with clients to design just the reports and analysis that they need?