The U.S. Department of Energy (DOE) has released the latest findings (Part 2) from a continuing investigation into energy losses in Ethernet cables used between Power over Ethernet (PoE) switches and luminaires in PoE-connected lighting systems. Testing was conducted in DOE’s Connected Lighting Test Bed in July and August 2018, continuing studies performed there in September 2017 (Part 1).
PoE technology offers the ability to provide low-voltage direct-current (DC) power and communication over a standard Ethernet cable. LED technology has reduced the power required for lighting applications, while advances in PoE standards and technology have yielded substantial increases in the amount of power that can be delivered to a networked device over a single cable. As a result, PoE technology is emerging in lighting and many other applications. PoE lighting systems can offer improved energy efficiency relative to traditional line-voltage alternating-current (AC) systems, but this can be offset to some extent by increased voltage drop in the low-voltage Ethernet cabling.
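The voltage-drop penalty described above comes down to I²R dissipation in the cable conductors. The sketch below illustrates the effect using assumed nominal values rather than figures from the study: a 54 V switch (PSE) port, a 44 W powered device, 50 m of cable at the Cat5e-class maximum conductor DC resistance of 9.38 ohms per 100 m, and power delivered over all four pairs.

```python
import math

V_PSE = 54.0                 # assumed switch (PSE) port voltage, volts
P_PD = 44.0                  # luminaire (powered device) input power, watts
LENGTH_M = 50.0              # assumed cable length, meters
R_COND = 0.0938 * LENGTH_M   # one conductor's DC resistance, ohms
# 4-pair delivery: current flows out on 4 conductors in parallel (R/4)
# and back on the other 4 (R/4), so the loop resistance is R/2.
R_LOOP = R_COND / 2.0

# The PD draws P_PD from what survives the cable drop, so the line
# current satisfies V*I - I^2*R = P_PD; take the smaller (physical) root.
disc = V_PSE**2 - 4.0 * R_LOOP * P_PD
current = (V_PSE - math.sqrt(disc)) / (2.0 * R_LOOP)

cable_loss = current**2 * R_LOOP        # I^2 R dissipated in the cable
switch_output = P_PD + cable_loss
loss_fraction = cable_loss / switch_output
print(f"cable loss: {cable_loss:.2f} W ({loss_fraction:.1%} of switch output)")
```

Under these assumptions the loss lands below the 5% mark; with 2-pair delivery the loop resistance doubles, roughly doubling the loss fraction, which is one reason the delivery scheme and cable choice matter.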
For the new study, a setup consisting of a PoE switch, a set of luminaires, and a reference meter was again used to test multiple cable models of varying design. The results were analyzed to investigate the impact of cable selection and installation practices on PoE lighting system energy efficiency. Additionally, the study aimed to continue the exploration of the effectiveness of guidelines published in American National Standards Institute (ANSI) Standard C137.3-2017, which in the Part 1 study were found to limit cable system power losses to 5% of PoE switch output in PoE lighting applications, provided that the average cable length in the system did not exceed 50 meters.
With regard to the impact of cable selection and installation practices on energy efficiency, the study found that with 44 W luminaires as powered devices and room ambient temperatures below 30°C, cable power losses were not substantially increased by cable bending or by bundling in uninsulated conduit. However, environments with higher ambient temperatures will have greater power losses due to increased conductor DC resistance (DCR). In addition, product selection and installation practices will take on increased importance as powered devices approaching 90 W of input power, conveyed over a single Ethernet cable, are introduced following publication of the recently approved Institute of Electrical and Electronics Engineers (IEEE) Standard 802.3bt.
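The ambient-temperature effect noted above follows from copper's roughly linear resistance-temperature relationship. The values below are textbook constants, not measurements from the study:

```python
# Temperature coefficient of resistance for annealed copper, per deg C.
ALPHA_CU = 0.00393

def dcr_at(temp_c, r_at_20c):
    """DC resistance at temp_c, scaled from the 20 degC reference value."""
    return r_at_20c * (1.0 + ALPHA_CU * (temp_c - 20.0))

r20 = 9.38            # ohms per 100 m, Cat5e-class conductor at 20 degC
r45 = dcr_at(45.0, r20)
increase = r45 / r20 - 1.0
print(f"DCR is {increase:.1%} higher at 45 degC than at 20 degC")
```

A 25°C rise in conductor temperature thus adds roughly 10% to DCR, and cable I²R losses scale with it.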
With regard to further exploring the effectiveness of the ANSI C137.3 guidance, three cables that had been excluded from the Part 1 study due to compatibility issues (two of shielded design, and one that required the use of patch cords) were evaluated in the Part 2 study. Results again demonstrated that the guidance limited power losses to less than 5% in the cables tested. However, these collective findings are not representative of all cable models and installation practices; e.g., power losses would be greater when cables are connected to patch cords, bundled in conduit, and loaded with powered devices approaching 90 W of input power. Cable losses decreased with increasing conductor diameter, as would be expected from the corresponding improvement in DCR. No appreciable difference was observed for the other cable characteristics, but given the study’s limitations (e.g., the limited set of cables tested), this does not mean those parameters have no effect on cable losses.
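The conductor-diameter trend above is what DC physics predicts: resistance scales with 1/d². The sketch below compares two common gauges using the standard AWG diameter formula; the specific gauges are illustrative and not tied to the cables the study tested.

```python
def awg_diameter_mm(gauge):
    """Nominal bare-conductor diameter for an AWG gauge, in mm."""
    return 0.127 * 92.0 ** ((36 - gauge) / 39.0)

d24 = awg_diameter_mm(24)   # conductor size typical of Cat5e cable
d23 = awg_diameter_mm(23)   # conductor size typical of Cat6 cable

# DC resistance is inversely proportional to cross-sectional area (d^2).
resistance_ratio = (d24 / d23) ** 2   # R_23AWG relative to R_24AWG
print(f"23 AWG has about {1 - resistance_ratio:.0%} less DCR than 24 AWG")
```

Stepping from 24 AWG to 23 AWG cuts DCR by roughly a fifth, which translates directly into lower cable losses at a given load current.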
To view the complete findings, including recommendations for manufacturers, specifiers, suppliers, installers, and stakeholders, download the full report.