A critical aspect of the financial health of solar leasing companies and other ‘third-party owners’ (TPOs) is the performance of their ‘fleet’ of installed PV systems. When a PV system is not operating at peak performance, it’s not producing as much power as expected, which negatively impacts the TPO’s bottom line.
This performance risk is one of the primary factors in the bankability of a leasing product and one of the principal hurdles to ‘securitization,’ which can significantly lower the cost of capital for the PV industry. Unfortunately, managing PV fleet performance is not as simple as one might assume, but there are solutions to help TPOs understand not only performance risk, but also the root cause of performance problems.
Benchmark for optimal performance
Although PV systems have a long life (typically 25-30 years), performance can be lower than expected due to a variety of factors—from environmental effects such as soiling or shading, to hardware malfunctions such as a failed inverter. TPOs often monitor the performance of the PV systems within their portfolios, but monitoring alone tells only one side of the story—how much a system is producing. To ensure the system is performing optimally, they also need to know how much it should be producing, from hour to hour, based on actual local weather conditions.
When armed with this ‘benchmark’ data, TPOs can compare actual production with simulated production, and use that information to identify and troubleshoot poor-performing systems. As a result, system owners can take corrective action when needed, such as rolling a truck to clean a system, correcting shading obstructions, or replacing hardware.
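In code, this comparison can be as simple as flagging hours where measured output trails the weather-based benchmark by more than some tolerance. The sketch below is purely illustrative—the function name and the 20% threshold are assumptions, not part of any specific monitoring product:

```python
def flag_underperformance(measured_kwh, simulated_kwh, tolerance=0.20):
    """Return indices of hours where measured production falls short of
    the simulated benchmark by more than `tolerance` (fractional).

    Hypothetical sketch: thresholds in practice would be tuned to the
    system and to simulation uncertainty.
    """
    flagged = []
    for i, (m, s) in enumerate(zip(measured_kwh, simulated_kwh)):
        # Only compare hours where the benchmark expects meaningful output
        if s > 0 and (s - m) / s > tolerance:
            flagged.append(i)
    return flagged

# Example: hour 2 is well below the benchmark (possible soiling or fault)
measured = [4.8, 5.1, 2.0, 4.9]
simulated = [5.0, 5.0, 5.0, 5.0]
print(flag_underperformance(measured, simulated))  # [2]
```

A persistent shortfall across many hours might suggest soiling or a specification error, while an abrupt drop points more toward a hardware failure.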
One benchmark solution currently available is SolarAnywhere® SystemCheck™. SystemCheck utilizes SolarAnywhere’s location-specific irradiance data and PV system specifications to produce power production estimates on an hourly basis. These production estimates can then be pulled into monitoring solutions such as the one offered by Deck Monitoring to provide owners with detailed performance comparisons. SystemCheck can also be used to analyze the performance of entire fleets of systems, a common scenario for TPOs.
Benchmark data is only as valuable as it is accurate, so validating accuracy is critical for data users. To answer the question ‘How accurate is SystemCheck?’, we recently completed validation testing in coordination with a utility in California. In this six-month study, we compared measured hourly energy production data for more than 2,000 PV systems against historical performance simulations using SolarAnywhere.
This effort was undertaken to demonstrate the accuracy of SolarAnywhere irradiance data and simulation methods, and the results were strong. For the subset of systems used in the comparison, the Relative Mean Absolute Error (rMAE) was less than 5% for the fleet on an hourly basis. Below is an example of measured versus SolarAnywhere production data for a single system.
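For readers unfamiliar with the metric, one common way to compute relative MAE is to divide the mean absolute hourly error by the mean measured production. The snippet below is a generic sketch of that definition, not the exact formula used in the study:

```python
def relative_mae(measured, simulated):
    """Relative Mean Absolute Error: mean absolute hourly error
    normalized by mean measured production (one common definition)."""
    abs_errors = [abs(m - s) for m, s in zip(measured, simulated)]
    return sum(abs_errors) / sum(measured)

# Illustrative hourly energy values (kWh)
measured = [4.0, 5.0, 6.0, 5.0]
simulated = [4.2, 4.9, 5.8, 5.1]
print(f"rMAE = {relative_mae(measured, simulated):.1%}")  # rMAE = 3.0%
```

Normalizing by total measured production keeps the metric comparable across systems of different sizes, which matters when aggregating accuracy across a fleet.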
As we’ve undertaken research to quantify SolarAnywhere accuracy, we’ve found that we can often identify system performance problems from the unique fingerprint of measured versus simulated data. These types of problems vary from misreporting of system specifications, to individual inverters being down on multi-inverter systems. We’ll cover this topic in more detail in the coming months, so stay tuned.
This validation exercise not only shed light on the accuracy of SolarAnywhere products, but also highlighted how important benchmarking performance over time is to ensuring that PV system owners—whether individuals or TPOs—get the most out of their investments. Just as importantly, this data can be used to diagnose system problems from the comfort of the control room, often avoiding costly truck rolls.