Posted by: AT&T Blog Team on October 4, 2012 at 3:01 pm
By Joe Marx, AT&T Assistant Vice President of Federal Regulatory
Yesterday, AT&T submitted a formal analysis of the V-COMM Test Report that came out in mid-July claiming minimal impact of Channel 51 and E-Block signals on Band 12 and Band 17 devices. As we previously discussed, there are real credibility issues with the testing conducted by V-COMM for the Lower 700 MHz A Block licensees.
The V-COMM Report concluded, contrary to multiple other tests and analyses, that although Band 12 devices are far more susceptible to interference from Channel 51 and E-Block signals than are Band 17 devices, LTE signals are strong enough to overcome the interference. The problem is that the test assumed an LTE signal strength not likely to occur in most real-world situations. After a more thorough review and additional testing by 7Layers (a well-regarded, independent testing firm), it is apparent that the conclusions in the V-COMM Report are wrong and reflect a number of incorrect assumptions, parameters, and methodological choices.
There is often more than one approach to take when embarking on an interference test program. In the case of interference testing, such as with Channel 51 and the E-Block, it is not enough to test a best-case or average-case scenario that looks for interference where it is not likely to exist and then claim the results prove it doesn't exist. That's like looking for snow at the equator and concluding from its absence that snow doesn't exist. But that is essentially the case with the V-COMM Report. It is much more useful to look at more challenging scenarios that show the presence of interference and prove those scenarios could occur in the real world. That is the approach taken by AT&T and the test labs on which we relied.
In the case of Channel 51, for example, our initial test cases compared Band 12 and Band 17 device performance when operating on uplink frequencies at the upper end of the B and C Blocks (centered at 711 MHz), where one would expect the most interference to occur. The testing then determined the weakest LTE signal at which the device could reliably maintain a data connection at 95% throughput (often referred to as the reference sensitivity), representing the edge of the cell site. The interfering Channel 51 signal was then increased to the point where performance degraded below the 95% threshold. The tests were then repeated at varying LTE signal strengths representing locations closer to the cell. In all of these tests, the Band 12 devices displayed degraded performance to the point where the data session was unstable and in most cases dropped completely. These tests showed that there are large areas where Band 12 device performance would be significantly degraded as compared with a Band 17 device.
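For readers who want to see the shape of such a test, the sweep described above can be sketched in a few lines of Python. Everything here is illustrative: the noise floor, the SINR-to-throughput mapping, and the 0.5 dB step are stand-ins, not the parameters 7Layers actually used.

```python
import math

def sinr_db(signal_dbm, interference_dbm, noise_dbm=-110.0):
    """Simplified SINR in dB, treating the interferer as added noise."""
    i_plus_n_mw = 10 ** (interference_dbm / 10) + 10 ** (noise_dbm / 10)
    return signal_dbm - 10 * math.log10(i_plus_n_mw)

def throughput_fraction(sinr):
    """Toy linear mapping from SINR (dB) to fraction of maximum throughput.
    Purely illustrative -- a real test measures actual data throughput."""
    return min(1.0, max(0.0, (sinr + 6.0) / 26.0))

def max_tolerable_interferer(signal_dbm, target=0.95):
    """Raise the interferer until throughput falls below the 95% target;
    return the last interferer level (dBm) the device could tolerate."""
    level = -120.0  # start well below any effect
    while throughput_fraction(sinr_db(signal_dbm, level)) >= target:
        level += 0.5
        if level > 0:
            break
    return level - 0.5
```

Run at several desired-signal levels, the sketch reproduces the qualitative point of the test design: a device at the cell edge (weak LTE signal) tolerates far less interference than one near the site, which is why testing only strong-signal cases can make interference appear to vanish.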
Because V-COMM failed to disclose some very pertinent testing information, we've had to reverse engineer its tests to enable apples-to-apples comparisons that explain V-COMM's surprising test results. After doing that, and running additional tests with V-COMM's parameters, it became clear that the major disconnect between our testing and V-COMM's is the LTE signal strengths and the number of physical resource blocks assigned to the device for the uplink and downlink. V-COMM appears to have chosen LTE signal levels that typically occur closer to the cell site, and it chose physical resource block allocations that typically occur only when just one or two people are attempting to transmit or receive data at the same time. After mimicking V-COMM's tests (with the same types of testing equipment and the same 710 MHz channel center used by V-COMM), but using more realistic LTE signal strength and physical resource block allocation assumptions, 7Layers found that, consistent with the previous testing and analyses, Channel 51 transmissions would cause Band 12 devices in large areas of the country to operate in severely degraded or inoperable states as compared with a Band 17 device.
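The report's exact allocations are not restated here, but one reason the physical resource block assumption matters is simple arithmetic: on the LTE uplink a device's total transmit power, capped at 23 dBm for the common Power Class 3, is spread across however many PRBs it is granted, so the allocation size directly changes the per-PRB power in the link budget. A rough sketch, assuming a 10 MHz carrier (50 uplink PRBs) and illustrative allocation sizes:

```python
import math

UE_MAX_POWER_DBM = 23.0  # 3GPP Power Class 3 cap for LTE handsets

def power_per_prb_dbm(n_prb, total_dbm=UE_MAX_POWER_DBM):
    """Per-PRB transmit power when total power is shared across n_prb blocks."""
    return total_dbm - 10.0 * math.log10(n_prb)

# A lone user on a 10 MHz carrier can be granted all 50 uplink PRBs;
# in a loaded cell each user may get only a handful (sizes are illustrative).
print(power_per_prb_dbm(50))  # ~6 dBm per PRB (lightly loaded assumption)
print(power_per_prb_dbm(4))   # ~17 dBm per PRB (loaded-cell assumption)
```

The point is not that either allocation is "correct," but that a test's PRB assumption materially shifts the link conditions under which interference is measured — which is why undisclosed allocations made apples-to-apples comparison impossible without reverse engineering.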
V-COMM’s E-Block testing likewise appears to use the same LTE signal levels and physical resource block allocations that fail to represent many real-world situations. V-COMM’s E-Block testing also relies on a 20-year-old propagation model that is not compatible with the E-Block network V-COMM seeks to evaluate. For example, the model assumes that all transmitters will be located below 300 feet, when we know, based upon Qualcomm’s MediaFLO experience, that an E-Block broadcast network will almost certainly have many transmitters placed at much higher locations. The model also fails to account for differences in propagation in urban, suburban, and rural areas, and it fails to account for differences in propagation characteristics among different frequency bands.
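V-COMM's model is not reproduced here, but for contrast, the widely used Okumura-Hata model shows how a propagation estimate can carry explicit corrections for frequency, antenna height, and urban/suburban/open terrain — exactly the dependencies the critique above says were missing. A minimal sketch (parameter choices illustrative; note that even Hata's validity range caps base-antenna height at 200 m, underscoring how a model's assumptions limit the scenarios it can represent):

```python
import math

def hata_path_loss_db(f_mhz, d_km, h_base_m=50.0, h_mobile_m=1.5,
                      environment="urban"):
    """Okumura-Hata median path loss in dB.
    Valid roughly for 150-1500 MHz, base antennas 30-200 m, 1-20 km."""
    # Mobile-antenna height correction for a small/medium city.
    a_hm = ((1.1 * math.log10(f_mhz) - 0.7) * h_mobile_m
            - (1.56 * math.log10(f_mhz) - 0.8))
    loss = (69.55 + 26.16 * math.log10(f_mhz)
            - 13.82 * math.log10(h_base_m) - a_hm
            + (44.9 - 6.55 * math.log10(h_base_m)) * math.log10(d_km))
    # Environment corrections: same site, very different predicted loss.
    if environment == "suburban":
        loss -= 2.0 * (math.log10(f_mhz / 28.0)) ** 2 + 5.4
    elif environment == "open":
        loss -= (4.78 * (math.log10(f_mhz)) ** 2
                 - 18.33 * math.log10(f_mhz) + 40.94)
    return loss
```

At 700 MHz the urban, suburban, and open-area predictions for the same transmitter differ by tens of dB — a gap large enough to flip an interference conclusion, which is why a model that ignores terrain and frequency dependence cannot fairly evaluate an E-Block broadcast network.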
In the end, the formal analysis of the V-COMM testing shows that its assumptions drove erroneous and misleading conclusions. AT&T has invested billions of dollars to deploy its advanced LTE network, and the Commission cannot rely on testing sleight of hand to impose a mandate that threatens that investment and will create serious interference issues for our customers.
Instead, the Commission must address the real-world challenges of the Lower 700 MHz band by accelerating the relocation of the remaining Channel 51 stations and adjusting the E-Block transmission characteristics to align with the other blocks in the Lower 700 MHz band.