20kW Solar Battery BMS (Battery Management System) Function Testing, Balancing Performance and SOC Accuracy Test


  20kW Grid-Connected Solar Battery Test Plan for Grid Interaction, Harmonic Distortion, Insulation Resistance, Environmental Adaptability, BMS Functionality, and Cycle Life

  I. Test Background and Core Objectives

  With the widespread application of photovoltaic energy storage technology in distributed energy systems, 20kW grid-connected solar batteries serve as core devices for energy storage and grid interaction. Their operational stability, power quality compatibility, electrical safety, environmental adaptability, BMS (Battery Management System) control capability, and long-term cycle life directly affect grid reliability and overall lifecycle economics. This test plan focuses on six key dimensions, aiming to verify that the equipment meets the full-scenario application requirements of grid-connection reliability, safety and compliance, environmental adaptability, BMS accuracy, and long-term durability:

  Ensuring coordinated interaction between the device and the grid, and mitigating grid shocks caused by power fluctuations and grid switching;

  Controlling harmonic distortion in power output to ensure that the quality of power entering the grid meets national standards;

  Verifying the insulation performance of the equipment to eliminate safety hazards such as leakage and insulation failure, providing a basis for long-term, safe operation of the equipment;

  Assessing the equipment's ability to withstand extreme temperature cycles and harsh outdoor environments to ensure stable operation in diverse regions and climates;

  Verifying the BMS's balancing performance and SOC (State of Charge) accuracy to ensure battery pack consistency, avoid overcharge and over-discharge, and extend overall life;

  Evaluating the capacity decay patterns under 1C charge and discharge conditions to clarify the performance retention rate of the equipment after long-term use, providing data support for lifespan planning and operation and maintenance decisions.

  II. Grid Interaction Testing

  1. Test Significance

  Grid interaction is the core of the 20kW solar battery's interoperability with the grid. This test simulates the full "charging - discharging - grid-connected power supply" scenario encountered in actual operation to verify the device's ability to respond to changes in grid parameters and to prevent abnormal interactions from causing grid frequency or voltage fluctuations or device downtime.

  2. Core Test Indicators and Implementation Standards

  Grid-connected power regulation capability: The reference standard is GB/T 37408-2019. Qualified requirements include a power regulation range of 0-20kW and a response time of no more than 1s.

  Grid fault low voltage ride-through: The reference standard is GB/T 19964-2012. Qualified requirements include a continuous support time of no less than 0.15s when the voltage drops to 0%.

  Off-grid/grid-connected automatic switching: The reference standard is GB/T 34120-2017. Qualified requirements include a switching time of no more than 50ms and no significant voltage surge during the switching process.

  Grid-connected current phase matching: The reference standard is IEC 61727:2004. Qualified requirements include a power factor of no less than 0.95, with either lagging or leading phases acceptable.

  3. Test Method

  A simulated power grid test platform was built, equipped with a grid simulator with adjustable voltage and frequency. The platform simulated normal grid conditions, voltage drops (10%-90% of rated voltage), and frequency fluctuations (49.5Hz-50.5Hz).

  The solar battery was controlled to switch between charge and discharge modes at different power levels (20%/50%/100% of rated power), and the device output power, grid-connected current, and grid parameter change curves were recorded.

  A grid power outage was simulated to test the device's off-grid switching logic. After the grid was restored, the accuracy and stability of the automatic grid connection were verified.
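  As an illustration only, the following Python sketch (not part of the test plan itself) shows how the quantitative criteria above could be checked against time-stamped measurements exported from the grid simulator; the record fields and numeric values are hypothetical placeholders, not a vendor API.

```python
# Minimal evaluation sketch for the grid-interaction pass criteria above.
# Field names and sample values are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class InteractionRecord:
    power_cmd_time_s: float       # when the power set-point was issued
    power_settle_time_s: float    # when output power settled within tolerance
    island_detect_time_s: float   # when grid loss was detected
    offgrid_stable_time_s: float  # when off-grid output was re-established
    power_factor: float           # measured at the point of grid connection

def evaluate(rec: InteractionRecord) -> dict:
    """Check the GB/T 37408 / GB/T 34120 / IEC 61727 criteria listed above."""
    response_time = rec.power_settle_time_s - rec.power_cmd_time_s
    switching_time = rec.offgrid_stable_time_s - rec.island_detect_time_s
    return {
        "power_response_ok": response_time <= 1.0,    # response time <= 1 s
        "switching_ok": switching_time <= 0.050,      # switching time <= 50 ms
        "power_factor_ok": abs(rec.power_factor) >= 0.95,
    }

# Example with hypothetical measured values
print(evaluate(InteractionRecord(0.0, 0.62, 10.000, 10.031, 0.98)))
```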

  III. Harmonic Distortion Rate Test

  1. Test Significance

  When solar batteries are connected to the grid, their internal power electronic devices, such as inverters, are prone to generating harmonic currents. If the harmonic distortion rate exceeds the specified value, it can distort the grid voltage waveform, interfere with the normal operation of nearby electrical equipment (such as precision instruments and communications equipment), and even accelerate the aging of grid equipment such as transformers and cables. This test focuses on controlling total harmonic distortion (THD) and the content of each characteristic harmonic.

  2. Core Test Indicators and Implementation Standards

  Based on GB/T 14549-1993 "Power Quality - Public Grid Harmonics" and GB/T 37408-2019 "Technical Requirements for Grid-Connected Photovoltaic Energy Storage Systems":

  Total Harmonic Distortion (THD): Under rated power (20kW) output conditions, grid-connected current THD ≤ 5%;

  Harmonic current content: 3rd harmonic ≤ 4%, 5th harmonic ≤ 3%, 7th harmonic ≤ 2%, 9th and higher harmonics ≤ 1% (all based on rated current).

  3. Test Method

  Connect a high-precision power quality analyzer (such as the Fluke 6100B) to the grid-connected output of the device (the point where the inverter connects to the grid). Set the sampling frequency to ≥ 2kHz and the sampling duration to ≥ 10 grid cycles.

  Test the device's harmonic data at 25%, 50%, 75%, and 100% of rated power, recording the total harmonic distortion (THD) and the current amplitude and phase of harmonics 3-31.

  Compare to standard limits and analyze the harmonic sources (such as inverter switching frequency and filter circuit parameters). If these values exceed the standard, propose optimization suggestions for the filtering solution (such as adding an LC filter).
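  As a minimal post-processing sketch (assuming the per-order harmonic current amplitudes have already been exported from the analyzer; all numbers below are placeholders), the following snippet computes the THD and checks each harmonic order against the limits listed in the previous subsection.

```python
import math

# Harmonic current amplitudes (A) indexed by harmonic order, as exported
# from the power quality analyzer; values are illustrative placeholders.
harmonics = {3: 0.9, 5: 0.7, 7: 0.4, 9: 0.2, 11: 0.15}
i_rated = 30.0   # rated output current (A), assumed for illustration
i_fund = 29.8    # measured fundamental current (A)

# Total harmonic distortion relative to the measured fundamental
thd = math.sqrt(sum(a * a for a in harmonics.values())) / i_fund * 100.0
print(f"THD = {thd:.2f}% (limit 5%)")

# Per-order limits from the indicator list above (percent of rated current)
limits = {3: 4.0, 5: 3.0, 7: 2.0}
for order, amp in sorted(harmonics.items()):
    limit = limits.get(order, 1.0)          # 9th and higher orders: 1%
    pct = amp / i_rated * 100.0
    print(f"H{order}: {pct:.2f}% (limit {limit}%) ->", "PASS" if pct <= limit else "FAIL")
```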

  IV. Insulation Resistance Test

  1. Test Significance

  Insulation resistance is a key indicator of the electrical safety of solar batteries and is directly related to the risk of electric shock to equipment operators and nearby personnel. During long-term operation, factors such as high temperatures, humidity fluctuations, and electrolyte leakage can cause insulation degradation. If the insulation resistance is too low, leakage, short circuits, and even fire can easily result. This test covers the equipment's critical electrical circuits to ensure that insulation performance meets safe operation requirements.

  2. Core Test Indicators and Implementation Standards

  Based on GB/T 17215.301-2007 "AC Current Measuring Equipment - Particular Requirements - Part 1: Active Energy Meters" and IEC 62109-2:2011 "Safety Requirements for Photovoltaic Inverters - Part 2":

  Insulation resistance between the positive and negative terminals of the battery and the device housing: ≥ 10MΩ at a DC 500V test voltage;

  Insulation resistance between the inverter output (AC side) and the housing: ≥ 5MΩ at an AC 1000V test voltage;

  Adaptability to hot and humid environments: After 48 hours in an environment with a temperature of 40°C and a relative humidity of 90%, the insulation resistance must still meet the above requirements.

  3. Test Method

  Before testing, disconnect all external power supplies and ground connections to ensure the test circuit is independent.

  Use an insulation resistance tester (such as the KEW 3125) to apply the specified test voltage across the "battery positive - case," "battery negative - case," and "AC output - case" circuits for 1 minute, then read the resistance values.

  Place the device in a constant temperature and humidity chamber to simulate a hot and humid environment, then repeat the test to verify the stability of the insulation performance.

  If a resistance value is below the standard limit, investigate for insulation damage or moisture on the terminal blocks. Repair the problem and retest.
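  A minimal bookkeeping sketch for this check is shown below; it assumes the three test points listed above with placeholder readings (one set taken before and one after the damp-heat exposure) and simply compares each value against its limit.

```python
# Pass/fail sketch for the insulation test points listed above.
# Measured values (in MΩ) are placeholders for illustration.
THRESHOLDS_MOHM = {
    "battery_pos_to_case": 10.0,   # DC 500 V test voltage
    "battery_neg_to_case": 10.0,   # DC 500 V test voltage
    "ac_output_to_case": 5.0,      # AC 1000 V test voltage
}

def check_insulation(readings_mohm: dict) -> bool:
    """Compare each measured insulation resistance with its limit."""
    all_ok = True
    for point, limit in THRESHOLDS_MOHM.items():
        value = readings_mohm[point]
        passed = value >= limit
        all_ok = all_ok and passed
        print(f"{point}: {value} MΩ (limit {limit} MΩ) ->", "PASS" if passed else "FAIL")
    return all_ok

# Readings before and after the 40 °C / 90 % RH damp-heat exposure (placeholders)
check_insulation({"battery_pos_to_case": 52.0, "battery_neg_to_case": 48.0, "ac_output_to_case": 21.0})
check_insulation({"battery_pos_to_case": 18.0, "battery_neg_to_case": 16.0, "ac_output_to_case": 7.5})
```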

  V. Environmental Adaptability Testing

  (I) -40°C to 60°C Temperature Cycle Test

  1. Test Significance

  20kW grid-connected solar batteries are often deployed outdoors. They must withstand extreme operating conditions, ranging from low temperatures (-40°C) in winter at high latitudes to high temperatures (60°C) in summer at low latitudes. These drastic temperature fluctuations can easily lead to degradation of active materials, electrolyte solidification or evaporation, and casing cracking. This test simulates extreme temperature cycles to verify the device's performance stability and structural integrity under alternating temperature extremes.

  2. Core Test Indicators and Implementation Standards

  Based on GB/T 31485-2015 "Safety Requirements for Power Batteries for Electric Vehicles" and IEC 61215:2021 "Terrestrial Crystalline Silicon Photovoltaic Modules — Design Qualification and Type Approval":

  Cycling Parameters: Total number of cycles ≥ 50, with a single cycle consisting of four stages: "High temperature phase (60°C ± 2°C, 4 hours) → Cooling (at a rate of 5°C/min) → Low temperature phase (-40°C ± 2°C, 4 hours) → Warming (at a rate of 5°C/min)";

  Performance Requirements: After cycling, the battery's rated capacity decay is ≤ 10%, charge and discharge efficiency is ≥ 90% at a rated power of 20kW, grid-connected interaction functions (power regulation and switching logic) are normal, the casing is free of cracks or deformation, and the terminals are not loose;

  Safety Requirements: No leakage, smoke, or bulging occurs during cycling, and the insulation resistance still meets the requirement of "≥ 10MΩ at 500V DC."

  3. Test Method

  Place the device (including the complete battery pack and inverter module) in a high and low temperature test chamber. Connect a temperature recorder, a charge-discharge tester, and a grid-connected simulation system to monitor the battery cell voltages, module temperature, and output power in real time.

  Start the test according to the set cycle parameters, pause after every 10 cycles, and let it rest at room temperature for 2 hours. Measure the current capacity and charge-discharge efficiency, and record the data trend.

  After 50 cycles, visually inspect the device's structural integrity, retest the insulation resistance and grid-connected interaction performance, and compare with the initial data to quantify the degree of degradation.
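  The post-cycling evaluation reduces to a few comparisons; the sketch below uses placeholder capacity and efficiency checkpoints (recorded at the 10-cycle pauses) and checks them against the ≤10% capacity decay and ≥90% efficiency requirements above.

```python
# Pass/fail sketch for the temperature-cycle performance requirements above.
# Capacities (Ah) and efficiencies (%) are placeholder data.
rated_capacity_ah = 100.0            # assumed initial (rated) capacity
checkpoints = {                      # cycle count -> (capacity Ah, efficiency %)
    10: (99.1, 95.2), 20: (98.4, 94.8), 30: (97.6, 94.1),
    40: (96.9, 93.5), 50: (96.1, 93.0),
}

final_capacity, final_efficiency = checkpoints[50]
decay_pct = (rated_capacity_ah - final_capacity) / rated_capacity_ah * 100.0

print(f"Capacity decay after 50 cycles: {decay_pct:.1f}% (limit 10%)")
print("Capacity decay OK:", decay_pct <= 10.0)
print("Charge/discharge efficiency OK:", final_efficiency >= 90.0)
```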

  (II) IP66 Protection Rating Test

  1. Test Significance

  Sand and dust in outdoor environments can easily clog the device's heat dissipation channels and corrode metal contacts, while rainwater leakage can cause internal circuit shorts. IP66 protection is the basic outdoor adaptation requirement for photovoltaic energy storage equipment ("Level 6 dustproof" means completely preventing dust intrusion, and "Level 6 waterproof" means withstanding strong water jets). This test verifies the enclosure's ability to protect against the intrusion of solid foreign objects and liquids.

  2. Core Test Indicators and Implementation Standards

  According to GB/T 4208-2017 "Degrees of Protection Provided by Enclosures (IP Code)":

  Dustproof (IP6X): The device is placed in a dust test chamber with a talcum powder concentration of 2kg/m³ for 8 hours. After the test, no visible dust is present inside the device, and the electrical components (inverter, wiring terminals) function normally.

  Waterproof (IPX6): Using a nozzle with a 12.5mm inner diameter, strong water jets are directed at the device from a distance of 2.5m to 3m, at a water pressure of 100kPa ± 5kPa and a flow rate of 100L/min ± 5L/min. All external surfaces of the device (front, sides, top, and bottom) are sprayed for 1 minute per m² of external surface area. After the test, there are no signs of water intrusion inside the device, the insulation resistance is ≥10MΩ (DC 500V), and the grid-connected function is normal.

  3. Test Method

  Dust Resistance Test:

  Power off the device, close all external interfaces (such as communication and charging ports), and secure it in the dust test chamber in its normal installation orientation.

  Start the test chamber and maintain a talcum powder suspension for 8 hours. After the test, remove the device, remove the outer casing, inspect the interior (focusing on the inverter cavity and battery junction box) for residual dust, and then power on to test basic functions.

  Water Resistance Test:

  The device remains powered on and connected to a grid-connected simulation system, monitoring output power and insulation resistance in real time.

  Adjust the nozzle parameters according to the standard requirements and spray each external surface of the device with the strong water jet. Pause for 5 minutes after each surface to check for water ingress at the enclosure gaps (such as the panel-to-box joints and heat dissipation holes), and simultaneously record changes in electrical parameters.

  After testing all surfaces, let the device sit for 24 hours. Then retest the insulation resistance and grid-connected performance to confirm there are no hidden faults.

  VI. BMS (Battery Management System) Function Testing

  (I) Balancing Performance Test

  1. Test Significance

  A 20kW solar battery pack consists of multiple cells connected in series or parallel. After long-term charge and discharge, variations in cell capacity and internal resistance can easily lead to imbalance. This can cause some cells to overcharge (risk of bulging) and others to overdischarge (waste of capacity), directly shortening the overall life of the battery pack. The BMS's balancing function actively or passively adjusts cell charge and discharge currents to minimize cell variations, a key means of ensuring battery pack consistency. This test verifies the effectiveness and efficiency of this balancing function.

  2. Core Test Indicators and Implementation Standards

  According to GB/T 34131-2017 "Technical Requirements for Power Storage Battery Management Systems for Electric Vehicles" and IEC 61508:2010 "Functional Safety of Electrical/Electronic/Programmable Electronic Safety-Related Systems":

  Balancing Start Condition: When the cell voltage difference ≥ 50mV (or the SOC difference ≥ 5%), the BMS must automatically start the balancing function;

  Balancing Capacity: Passive balancing current ≥ 100mA/cell, active balancing current ≥ 1A/cell (determined by BMS type);

  Balancing Effect: Within 4 hours after balancing is started, the cell voltage difference must drop to ≤ 20mV, and no cells must be overcharged (exceeding the rated voltage by 10%) or over-discharged (below the cut-off voltage by 10%) during the balancing process;

  Consistency Retention: After 100 1C charge-discharge cycles, the cell voltage difference of the battery pack with balancing enabled must be ≤ 30mV, and that of the pack without balancing enabled must be ≤ 100mV (for comparison to verify balancing effectiveness).

  3. Test Method

  Establishing an Unbalanced State:

  Disassemble the battery pack into cells and charge and discharge them at varying depths (e.g., charge some cells to 100% SOC and others to 50%) until the cell voltage difference reaches 80mV (exceeding the balancing threshold). Reassemble the battery pack and connect it to the BMS and a charge/discharge tester.

  Record the initial state: cell voltage, total voltage, and SOC display values to confirm that the BMS has recognized the unbalanced state.

  Balancing Function Verification:

  Start the charge and discharge system at a 0.5C current (simulating photovoltaic charging conditions). Simultaneously, enable the BMS balancing function. Record the cell voltage every 10 minutes using a high-precision voltage acquisition module (accuracy ±0.01V) and plot a "time-voltage difference" curve.

  Monitor the balancing current: for passive balancing, calculate the current from the voltage drop across the resistors connected in parallel with each cell; for active balancing, read the balancing module's output current through the BMS communication interface to verify that the specified current is met.

  After balancing is complete, allow the battery to rest for 1 hour and re-measure the cell voltage difference to confirm that it meets the ≤20mV requirement.
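  As a sketch of how the logged cell voltages can be evaluated against these criteria (the four-cell sample and the voltage values are assumptions for illustration), the snippet below checks the 50mV start threshold and the ≤20mV spread within 4 hours.

```python
# Evaluation sketch for the balancing criteria above, applied to a
# cell-voltage log sampled during balancing. Data are placeholders.
samples = [
    # (minutes since balancing start, [cell voltages in V])
    (0,   [3.38, 3.35, 3.33, 3.30]),
    (60,  [3.37, 3.35, 3.34, 3.32]),
    (120, [3.36, 3.35, 3.34, 3.33]),
    (240, [3.352, 3.348, 3.345, 3.341]),
]
START_THRESHOLD_V = 0.050   # balancing must trigger at >= 50 mV spread
TARGET_SPREAD_V = 0.020     # spread must reach <= 20 mV within 4 h

initial_spread = max(samples[0][1]) - min(samples[0][1])
final_spread = max(samples[-1][1]) - min(samples[-1][1])

print(f"Initial spread: {initial_spread * 1000:.0f} mV "
      f"(balancing should trigger: {initial_spread >= START_THRESHOLD_V})")
print(f"Spread at 4 h: {final_spread * 1000:.0f} mV "
      f"(meets <= 20 mV target: {final_spread <= TARGET_SPREAD_V})")
```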

  Long-term consistency verification:

  Compare two groups: Group A with BMS balancing enabled and Group B with balancing disabled. Subject both groups to 100 1C charge-discharge cycles.

  After the cycles, measure the cell voltage difference and capacity decay rate of the two groups to verify the effectiveness of the balancing function in improving battery pack consistency.

  (II) SOC Accuracy Test

  1. Test Significance

  SOC (State of Charge) is the core basis for scheduling PV energy storage systems. If the SOC estimate is inaccurate, the system may over-discharge the battery (causing damage) or leave it under-charged (wasting energy storage capacity). The SOC accuracy of the BMS therefore directly affects the effectiveness of the charge and discharge control strategy. This test covers static, dynamic, and varying-temperature operating conditions to verify the accuracy of the SOC calculation.

  2. Core Test Indicators and Implementation Standards

  Based on GB/T 34131-2017 "Technical Requirements for Power Battery Management Systems for Electric Vehicles" and SAE J1716-2019 "Electric Vehicle Battery State of Charge (SOC) Test Method":

  Static SOC Accuracy: At 25°C ± 2°C, after the battery has been allowed to rest for 24 hours (to eliminate polarization effects), the error between the BMS displayed SOC and the "discharge method measured SOC" should be ≤3%;

  Dynamic SOC Accuracy: During 1C charge/discharge, the error between the real-time SOC display and the "accumulated charge and discharge capacity converted SOC" should be ≤5%;

  Temperature Adaptability: At -20°C, 0°C, 25°C, and 50°C, the static SOC error is ≤5% and the dynamic SOC error is ≤8%;

  Recovery Capacity: After the battery is deeply discharged to SOC = 0% (cut-off voltage) and allowed to rest for 12 hours, the SOC rebound value displayed by the BMS must be ≤2% (to avoid a false SOC reading).

  3. Test Method

  Static SOC Accuracy Calibration:

  Control the ambient temperature at 25°C ± 2°C. Charge the battery pack at 0.2C to 100% SOC (cut-off voltage), let it rest for 24 hours, and record the SOC₁ displayed by the BMS.

  Discharge the battery pack at a constant current of 0.2C to 0% SOC (cut-off voltage). Record the actual discharge capacity, Cact. Calculate the measured SOC: SOCact = (Cact - Discharged Capacity) / Cact × 100% (record every 20% SOC during discharge).

  Compare the BMS displayed SOC with SOCact and calculate the error: Error = |SOC Display - SOC Actual| / SOC Actual × 100%. Verify that it is ≤ 3%.

  Dynamic SOC Accuracy Test:

  At 25°C, charge at 1C (from 0% to 100% SOC), recording the BMS-displayed SOC₂ and the cumulative charge capacity Ccharge every 10 minutes. Convert SOCcharge = Ccharge / Cactual × 100% and verify that the calculated error is ≤5%.

  Then discharge at 1C (from 100% to 0% SOC), recording the BMS-displayed SOC₃ and the cumulative discharge capacity Cdischarge every 10 minutes. Convert SOCdischarge = (Cactual - Cdischarge) / Cactual × 100% and verify that the calculated error is ≤5%.
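  The error calculations above reduce to simple arithmetic; the sketch below (with placeholder capacities and BMS readings) applies the static discharge-method comparison and the dynamic coulomb-counting comparison described in the preceding steps.

```python
# SOC accuracy sketch using the formulas from the steps above.
# Capacities are in Ah; all measured values are placeholders.
c_actual = 100.0           # actual capacity from the 0.2C discharge

# Static check: BMS-displayed SOC vs. discharge-method SOC at 20% steps
static_points = [          # (BMS-displayed SOC %, capacity discharged so far Ah)
    (81.0, 20.0), (61.5, 40.0), (40.8, 60.0), (21.2, 80.0),
]
for soc_display, discharged in static_points:
    soc_actual = (c_actual - discharged) / c_actual * 100.0
    error = abs(soc_display - soc_actual) / soc_actual * 100.0
    print(f"static: display {soc_display}% vs actual {soc_actual}% -> error {error:.1f}% (limit 3%)")

# Dynamic check during 1C charging: coulomb-counted SOC vs. BMS display
dynamic_points = [         # (BMS-displayed SOC %, cumulative charge Ah)
    (26.0, 25.0), (52.0, 50.0), (77.0, 75.0),
]
for soc_display, c_charge in dynamic_points:
    soc_charge = c_charge / c_actual * 100.0
    error = abs(soc_display - soc_charge) / soc_charge * 100.0
    print(f"dynamic: display {soc_display}% vs counted {soc_charge}% -> error {error:.1f}% (limit 5%)")
```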

  Temperature Adaptability Test:

  Place the device in a high and low temperature test chamber set to -20°C, 0°C, and 50°C. Allow 4 hours at each temperature point (to stabilize the temperature), and repeat the "static + dynamic" test steps.

  Record the SOC error at different temperatures to verify that it meets the "static ≤ 5%, dynamic ≤ 8%" requirements.

  Recovery Capacity Verification:

  Discharge at 1C to SOC = 0% (cut-off voltage), disconnect the charge and discharge system, and let it rest for 12 hours.

  Record the SOC recovery value displayed by the BMS, which must be ≤2% (if the recovery is too high, it means that the BMS has not eliminated the effects of polarization, resulting in an error in the SOC calculation).

  VII. Cycle Life Testing - 1C Charge and Discharge Capacity Decay Rate

  1. Test Significance

  1C charge and discharge is a typical operating condition for a 20kW grid-connected solar battery ("1C" means charging or discharging at a current numerically equal to the battery's rated capacity in Ah, i.e., a full charge or discharge in roughly one hour; for example, a 100Ah battery pack has a 1C charge and discharge current of 100A). Capacity decay under long-term cycling directly determines the lifecycle value of the device. Excessive decay shortens the energy storage time, reduces the grid-connected power supply capacity, and forces premature device replacement, increasing operation and maintenance costs. This test simulates long-term 1C cycling to quantify the capacity decay rate and verify whether the device meets the design life requirements (typically 5-10 years).

  2. Core Test Indicators and Implementation Standards

  According to GB/T 31484-2015 "Cycle Life Requirements and Test Methods for Power Batteries for Electric Vehicles" and IEC 62620:2018 "Lithium Secondary Batteries and Battery Packs for Industrial Applications":

  Basic Parameters: The test environment temperature is controlled at 25°C ± 2°C (simulating normal operating conditions). The initial battery capacity must be calibrated using the "0.2C charge to rated voltage → 1 hour standstill → 0.2C discharge to cut-off voltage" procedure, recorded as C₀.

  Cycling Parameters: 1C constant current charge to the battery's rated voltage (e.g., approximately 3.65V/cell for lithium iron phosphate batteries), then constant voltage charge until the current drops to 0.05C (charge cut-off). After 30 minutes of standstill, discharge at 1C constant current to the discharge cut-off voltage (e.g., approximately 2.5V/cell for lithium iron phosphate batteries), completing one cycle.

  Capacity Fade Limit: After 1000 cycles, the actual battery capacity (C₁₀₀₀) must be ≥80% of C₀; after 2000 cycles, C₂₀₀₀ must be ≥70% of C₀ (if the device has a design life of 10 years, assuming an average of 300 cycles per year, 2000 cycles corresponds to approximately 6.7 years, which must meet the degradation requirements).

  Supplementary requirements: During cycling, the battery cell voltage difference must be ≤50mV, the outer shell must not bulge or leak, the charge and discharge efficiency (discharge capacity/charge capacity) must be ≥95%, and the insulation resistance must still meet the requirement of "≥10MΩ at 500V DC" after each cycle.

  3. Test Method

  Pretreatment and Initial Capacity Calibration:

  Place the device battery pack (disassemble into the smallest test unit or keep the entire module, consistent with actual application) in a constant temperature chamber (25°C ± 2°C). Connect a high-precision charge and discharge tester (such as the Xinwei BTS-9000) and record the single cell and total voltages.

  Charge at 0.2C to the rated voltage, then hold the voltage constant until the current drops to 0.05C. Allow the battery to rest for 1 hour. Discharge at 0.2C to the cut-off voltage and record the discharge capacity as the initial capacity C₀. Repeat this calibration twice and take the average value to ensure accuracy.

  1C Cycle Test Execution:

  Initiate the cycle according to the following procedure: "1C constant current charge → constant voltage charge to 0.05C → rest for 30 minutes → 1C constant current discharge to cutoff voltage." After every 100 cycles, pause the test and re-measure the current capacity Cₙ using the "0.2C charge and discharge" method (n is the number of cycles). Calculate the capacity decay rate: decay rate = (C₀ - Cₙ) / C₀ × 100%;

  Real-time monitoring of key parameters during the cycle: charge time, discharge time, cell voltage extremes (maximum/minimum voltage), and battery surface temperature (must be ≤45°C). If a cell voltage difference exceeds 50mV, the temperature rises abnormally (>50°C), or the battery case deforms, immediately pause the test and analyze the cause.

  Endpoint Determination and Data Summary:

  When the re-measured capacity Cₙ falls below 80% of C₀ at a checkpoint, stop cycling and record the total number of cycles to that point (i.e., the actual cycle life). If C₂₀₀₀ ≥ 70% of C₀ after 2000 cycles, stop according to the test target (or continue until the capacity decay reaches the limit).

  Collate all cycle data and plot a "number of cycles - capacity decay rate" curve. Analyze the decay pattern (e.g., rapid decay in the first 500 cycles, followed by a gradual decline). Verify the device's grid-connected functionality after cycling (e.g., whether the 20kW power output is normal).
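  A minimal sketch of the capacity-fade bookkeeping is given below, assuming placeholder 0.2C capacity checkpoints taken at several cycle counts; it computes the decay rate at each checkpoint and evaluates the 1000-cycle (≥80% of C₀) and 2000-cycle (≥70% of C₀) endpoints.

```python
# Capacity-fade bookkeeping sketch for the 1C cycle life test above.
# All capacities (Ah) are placeholder values.
c0 = 100.0                      # calibrated initial capacity
capacity_checkpoints = {        # cycle count -> re-measured 0.2C capacity
    100: 99.0, 500: 95.5, 1000: 91.2, 1500: 87.0, 2000: 82.4,
}

for n, cn in sorted(capacity_checkpoints.items()):
    decay = (c0 - cn) / c0 * 100.0          # decay rate = (C0 - Cn) / C0 * 100%
    retention = cn / c0 * 100.0
    print(f"cycle {n}: retention {retention:.1f}%, decay {decay:.1f}%")

print("1000-cycle requirement (>= 80% of C0):", capacity_checkpoints[1000] >= 0.80 * c0)
print("2000-cycle requirement (>= 70% of C0):", capacity_checkpoints[2000] >= 0.70 * c0)
```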

  VIII. Test Summary and Application Value

  This 20kW grid-connected solar battery was subjected to six core tests, encompassing grid interoperability, power quality, electrical safety, environmental adaptability, BMS accuracy, and long lifespan, to establish a comprehensive performance verification system. Its application value is reflected in four key areas:

  Grid Adaptability Assurance: Grid interaction and harmonic distortion rate tests ensure that the device does not disrupt public power quality after connecting to the grid. Low voltage ride-through and phase matching capabilities meet grid dispatch requirements, reducing the risk of grid failures.

  Safety and Environmental Compatibility: Insulation resistance testing eliminates electric shock hazards. The -40°C to 60°C temperature cycling and IP66 testing ensure reliable operation in outdoor environments, including extreme cold and heat, dust, and heavy rain, making the device suitable for deployment in diverse climate zones across China.

  BMS Precision Control: Balancing performance testing ensures cell consistency in the battery pack, preventing local overcharge and over-discharge, and extending overall lifespan. SOC accuracy testing ensures on-demand energy storage scheduling, improving capacity utilization and minimizing battery damage.

  Lifecycle Economic Efficiency: 1C cycle life testing quantifies capacity degradation patterns and, combined with BMS control effectiveness, provides users with recommended equipment replacement cycles (e.g., 1500 cycles plus BMS balancing can extend the lifespan to 8-10 years), avoiding the cost and safety risks associated with premature retirement or overuse.

  Test results can be directly used for equipment factory certification, technical evaluation in project bidding, and cross-brand performance comparisons. They also provide data support for the "equipment selection - operation and maintenance planning - lifecycle cost estimation" process for photovoltaic energy storage systems, driving the development of distributed energy systems toward high reliability, long life, and low cost.
