by Franck Latrémolière on Friday 15 March 2013 (corrected 25 March 2013)
Update 26 March 2013: I have now written a brief update on this proposal (and published the relevant documents).
This article highlights problems with the DCP 143 change report that has been submitted by the DCP 143 working group to the March 2013 DCUSA panel meeting. Maybe the DCUSA panel will find a way of solving these problems. I'll eat appropriate humble pie for shouting out too early if they do.
The documents that I am commenting on are on the password-protected part of http://dcusa.co.uk/ which hosts documents for meetings "held in open session". I think that these documents should be published without a password barrier, but I am not currently brave enough to just republish them myself. But I answer emails sent to email@example.com and I am sometimes helpful...
DCP 143 is a proposal to amend the DCUSA. The proposal was raised in July 2012 by one supplier following discussions in the "DCMF MIG billing supergroup", a forum that discussed inconsistencies in billing processes between distributors that still existed despite the implementation of the CDCM, a common distribution charging methodology.
The problem itself is simple and real.
For half hourly settled sites, suppliers have a duty to arrange for a data collector to read the meter and pass on the data to the relevant distributor. The relevant data include both active power and reactive power data. Energy settlement only relies on active power data; distributors are the main users of reactive power data.
Sometimes (for 2-20 per cent of relevant meters, according to data collected by the working group), the reactive power data are missing. Some distributors then use various distributor-specific methods to estimate reactive power data and use the estimated data in order to bill the supplier for excess reactive power charges (p/kVArh) and exceeded capacity charges (p/kVA/day).
In most cases the suppliers will merely pass these charges through to the customer. But suppliers still care, because of the potential for disputes with customers, and because different approaches to estimation make the supplier's work of validating distributor charges more difficult.
The solution proposed by DCP 143 is to mandate that whenever reactive power data are missing, all distributors must assume a power factor of 0.9.
Thus, for each MWh consumed, they must bill as if 484 kVArh had been imported (because SQRT(1+0.484^2) = 1/0.9). (An earlier version of this article gave an erroneous figure of 436 kVArh here.)
An assumed reactive power import of 484 kVArh with an active import of 1 MWh automatically generates an excess of 156 kVArh over the 329 kVArh allowed by the CDCM rule that only reactive power which pushes the power factor below 0.95 is chargeable (SQRT(1-0.95^2)/0.95 = 0.329). CDCM reactive power charges vary between about 0.1 p/kVArh and 0.666 p/kVArh, so we are talking about an additional charge of up to about £1/MWh. (An earlier version of this article gave erroneously low figures here.) Not the end of the world, but usually worth worrying about.
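The arithmetic above can be checked with a short sketch. This is my own illustration, not anything from the change report; the 0.666 p/kVArh rate is the worst-case CDCM figure quoted above.

```python
import math

def reactive_kvarh(active_mwh, power_factor):
    """Reactive import (kVArh) implied by an assumed power factor,
    for a given active import in MWh."""
    return active_mwh * 1000 * math.tan(math.acos(power_factor))

assumed = reactive_kvarh(1, 0.90)   # DCP 143's assumed power factor
allowed = reactive_kvarh(1, 0.95)   # CDCM chargeability threshold
excess = assumed - allowed
# Worst-case CDCM rate of 0.666 p/kVArh, converted to pounds
charge_gbp = excess * 0.666 / 100

print(round(assumed), round(allowed), round(excess), round(charge_gbp, 2))
# roughly 484 kVArh assumed, 329 kVArh allowed, 156 kVArh excess,
# about £1.04 per MWh in the worst case
```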
If the active power consumption is close to the maximum agreed capacity, then the use of an estimated power factor of 0.9 will also trigger estimated exceeded capacity charges. CDCM capacity charges are up to 9.67 p/kVA/day, and in the very worst-case scenario the estimation policy would add 111 kVA for each MW consumed. Keeping with the worst-case scenario, a single half hour consumption of 1 MW would trigger a month of exceeded capacity charges, so 30*111*0.0967 = £322 for half a MWh. But in most cases the cost will be much less, or zero.
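The worst-case capacity figure can be reproduced the same way. Again, this is my own illustration of the article's numbers (9.67 p/kVA/day rate, 30 days of charges), not a calculation from the change report.

```python
# Extra apparent power implied by assuming a 0.9 power factor:
# 1 MW of active demand becomes 1000 / 0.9 = 1111 kVA, i.e. 111 kVA extra.
added_kva_per_mw = 1000 / 0.9 - 1000

# Worst-case CDCM capacity rate, converted from p/kVA/day to GBP
rate_gbp_per_kva_day = 9.67 / 100

# A single half hour at 1 MW triggering a month of exceeded capacity charges
cost_gbp = 30 * round(added_kva_per_mw) * rate_gbp_per_kva_day

print(round(added_kva_per_mw), round(cost_gbp))  # about 111 kVA, about £322
```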
Whilst the DCUSA change only provides for the distributor to bill these amounts to the supplier, given the nature of these charges (which are highly customer-specific and nothing to do with wholesale energy markets) the customer is likely to be the final payer under its agreement with its supplier.
When data have been estimated by a distributor because the supplier's data collection agent had not provided them, the root cause of the charges was probably a failure in the supplier's meter data collection process over which the customer has no control. And yet the customer would probably still be charged.
But, fear not, Ofgem is here to protect the interests of customers, right?
A DCUSA working group was assembled. It included various industry people (mainly billing specialists). There was also an Ofgem observer, Stephen Perry, and the change report says that "Ofgem has been fully engaged throughout the development of DCP 143".
There are records of five meetings (including one teleconference). DCUSA parties were consulted twice, first in general terms and then with a set of specific supplementary questions. Good-quality answers were received.
Customers are the ones who would bear most of the costs in the areas served by distributors that do not currently apply estimated reactive power charges (about half of the country, by the looks of it). They are not generally DCUSA parties. No attempt seems to have been made to consult them. Why? Was the Ofgem observer asleep on the job? Failure #1
The change report says that meetings were held in open session and the minutes and papers of each meeting are available on the DCUSA website. That's very nearly true, with one important exception: there are no minutes from the final meeting, even though it was all the way back on 3 January 2013. So if I find any problems with the change report then I'll have no way of knowing to what extent members of the working group or the Ofgem observer were complacent in allowing those to go through. Failure #2
Now to the change report document itself.
A crucial appendix is missing. Paragraph 4.6 states: "The Working Group noted that the justification for choosing the power factor of 0.9 came from the BEAMA paper. The Working Group agreed to attach a copy of this paper as an appendix for reference with the Change Report". There is no such appendix in the package of documents. Failure #3
Paragraph 4.23 gives the number of consultation respondents (remember: this excludes the main victims, customers) who thought that the change would better meet each relevant objective. It does not report the number of respondents who thought that the change would not better meet each relevant objective (even though cogent and targeted objections were included in the consultation responses). Failure #4
It looks like consultation responses themselves were ignored when it suited the working group to do so. Three examples:
Seven failures is enough as a start. Would I find more if I kept looking?