
10 Problems in Cabinet Installation
Release Time: 2019-3-29 15:39:52

FeiKai Electronics has solved many heat-related problems in cold aisle data center projects. For data center operators, here is a short list of 10 common problems for reference.

1. Hidden leakage: Cold air leaks from the plenum under the raised floor into adjacent spaces or through support columns. This kind of leakage is quite common and causes pressure loss in the cold aisle, so hot or dusty, humid air enters the room elsewhere. The only way to avoid this problem is to get under the raised floor, inspect the perimeter and support columns, and seal any gaps you find.

2. Too many perforated tiles: There is no reason to place perforated tiles in hot aisles or in empty areas; doing so wastes cooling capacity. It is also possible to supply too much air through perforated tiles at the rack intakes. An intake temperature at the top of an IT rack that is below the room's normal temperature is a major warning sign (a rough sizing sketch follows).
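As a rough illustration of how to judge whether a rack actually needs more perforated tiles, the sketch below uses the standard sensible-heat relation for air (CFM ≈ 3.16 × W / ΔT in °F); the 500 CFM per-tile figure is an assumed typical value, not something from the text, and depends on tile type and underfloor pressure.

```python
import math

# Illustrative sizing sketch: estimate the perforated-tile airflow one rack needs,
# so extra tiles are not dropped in "just in case". The 3.16 factor comes from the
# sensible-heat relation for air; the per-tile airflow is an assumed typical value.

def required_cfm(it_load_watts: float, delta_t_f: float = 20.0) -> float:
    """Airflow (CFM) needed to carry away it_load_watts at a delta-T of delta_t_f (F)."""
    return 3.16 * it_load_watts / delta_t_f

def tiles_needed(it_load_watts: float, cfm_per_tile: float = 500.0) -> int:
    """Perforated tiles needed to feed one rack, rounded up."""
    return math.ceil(required_cfm(it_load_watts) / cfm_per_tile)

if __name__ == "__main__":
    # Example: a 5 kW rack at a 20 F delta-T needs about 790 CFM, i.e. two tiles, not four.
    print(round(required_cfm(5000)))  # ~790
    print(tiles_needed(5000))         # 2
```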

3. Unsealed raised-floor openings: Although many cold aisle data center operators have tried to seal cable cutouts and other gaps in the raised floor, few have actually finished the job. The remaining gaps let a large amount of cold air escape into areas where it is not needed. Electrical equipment such as power distribution units or remote power panels is a particularly important place to look for unsealed openings.

4. Poor rack sealing: It is basic airflow management to install blanking panels in empty rack spaces, but not everyone does so. Some cabinets are also not designed with a seal between the mounting rails and the edge of the cabinet. Efficiency-minded operators seal these openings, as well as existing and potential openings at the bottom of the cabinet.

5. Inaccurate temperature and humidity sensor calibration: Sometimes suppliers ship uncalibrated sensors, and sometimes calibration drifts over time. Either way, poorly managed cooling units end up working against each other instead of together (see the sketch below).
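Below is a minimal illustration of single-point offset correction against a trusted handheld reference meter; the class and readings are hypothetical examples, not any CRAC vendor's API.

```python
from dataclasses import dataclass

@dataclass
class SensorCalibration:
    """Single-point offset correction for one temperature or humidity sensor."""
    offset: float = 0.0  # correction added to every raw reading

    def calibrate(self, raw_reading: float, reference_reading: float) -> None:
        # Offset = trusted reference value minus what the sensor reported at the same point.
        self.offset = reference_reading - raw_reading

    def corrected(self, raw_reading: float) -> float:
        return raw_reading + self.offset

# Example: a return-air sensor reads 23.4 C where a calibrated handheld meter reads 24.1 C.
cal = SensorCalibration()
cal.calibrate(raw_reading=23.4, reference_reading=24.1)
print(round(cal.corrected(23.4), 1))  # 24.1 after correction
```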

6. CRAC units fighting over humidity control: A common way for two CRAC units to work against each other is when adjacent units receive return air at different temperatures. They then read different relative humidity values, so one unit humidifies while the other dehumidifies the same air. Solving this requires a working knowledge of the psychrometric chart and correctly set humidity control points; the dew point sketch below shows why the readings diverge.
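One common remedy (an assumption here, not something the original text prescribes) is to control on dew point rather than relative humidity. The sketch below uses the Magnus approximation to show why two units sampling the same air at different temperatures report different RH values yet nearly the same dew point.

```python
import math

def dew_point_c(temp_c: float, rh_percent: float) -> float:
    """Approximate dew point (C) from dry-bulb temperature and relative humidity,
    using the Magnus formula (Alduchov-Eskridge coefficients)."""
    a, b = 17.625, 243.04
    gamma = (a * temp_c) / (b + temp_c) + math.log(rh_percent / 100.0)
    return (b * gamma) / (a - gamma)

# The same air mass seen by two CRAC units at different return temperatures:
# the RH readings disagree, but the dew point (actual moisture content) barely moves.
print(round(dew_point_c(24.0, 50.0), 1))  # ~12.9 C
print(round(dew_point_c(30.0, 35.0), 1))  # ~12.9 C
```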

7. Excess cooling capacity: Many cold aisle data center operators provision far more cooling capacity than they need. When capacity greatly exceeds the load and the surplus CRAC units are not held as standby backup, the whole cooling scheme suffers, because too many units run at low efficiency. When the underfloor supply temperature is high and some racks are hard to cool, the operator's instinctive response is to run more cooling units. Counterintuitively, the right thing to do is often to run fewer CRAC units, so that the remaining units carry a higher, more efficient load (a rough sizing sketch follows).
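As a back-of-the-envelope illustration (all numbers below are assumptions, not figures from the text), the sketch counts how many CRAC units a given IT load actually justifies, with the surplus held in standby.

```python
import math

def units_to_run(it_load_kw: float, unit_capacity_kw: float, redundancy_units: int = 1) -> int:
    """CRAC units worth running: enough to carry the load, plus the chosen redundancy."""
    return math.ceil(it_load_kw / unit_capacity_kw) + redundancy_units

# Example: 180 kW of IT load on 60 kW units justifies running 4 (3 for the load + 1 redundant),
# even if 7 units are installed; the rest sit in standby instead of idling inefficiently.
print(units_to_run(180, 60))  # 4
```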

8. Empty cabinet spaces: This is another obvious factor, but for some reason not everyone takes it seriously. When one or more rack spaces are left empty, the airflow balance is disturbed: exhaust air recirculates into the cold aisle, or cold air is lost from it. The usual result is over-cooling, supplying more air than is actually needed to compensate for the loss.

9. Poor rack layout: Ideally, racks should be arranged in hot aisle / cold aisle rows, with the main CRAC units placed at the ends of each row. A scattering of small racks with no consistent orientation helps no one. Arranging racks front-to-back, or lining CRAC units up in the same row as the IT equipment, is equally unhelpful.

10. Cooling management does not get the attention it deserves: Operators rarely anticipate the benefits of better cooling management, which leaves capacity stranded and drives up operating costs. Simple tasks such as installing blanking panels pay off, yet they are often overlooked. In extreme cases, a well-managed data center cooling system can even defer an expansion or the construction of a new facility.
