
CSA1-N8S1684X
Powered by the high-performance BM1684X, this server features up to 8 BM1684X computing modules. It delivers up to 256TOPS (INT8) peak computing power, 128TFLOPS (FP16/BF16) computing power, or 16TFLOPS (FP32) high-precision computing power, along with multi-channel video processing capability. It supports migration of a wide range of algorithms and provides a one-stop deep learning development toolkit, and with the BMC management system, secondary development is also supported.
The BM1684X AI SoC integrates an octa-core Cortex-A53 running at up to 2.3GHz and an NPU delivering 32TOPS (INT8) / 16TFLOPS (FP16/BF16) / 2TFLOPS (FP32) computing power. Each computing module provides 16GB of LPDDR4 memory and up to 128GB of eMMC storage.
The server can be equipped with up to 8 BM1684X computing modules, and users can customize the number of modules and their storage configuration. This flexibility makes it suitable for a wide variety of scenarios.
It supports up to 256TOPS (INT8) peak computing power, 128TFLOPS (FP16/BF16) computing power, or 16TFLOPS (FP32) high-precision computing power, meeting the application requirements of deep-learning model development.
The server supports up to 256-channel H.264 1080p@25fps video decoding, 256-channel H.264 1080p@25fps HD video processing (decoding + AI analysis), and 96-channel H.264 1080p@25fps video encoding.
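As a consistency check, the aggregate figures above follow directly from the per-module specifications in the Technical Specifications table later in this document (32TOPS INT8 per BM1684X node, 32-channel 1080p decoding and 12-channel 1080p encoding per node):

$$
32\,\text{TOPS} \times 8\ \text{modules} = 256\,\text{TOPS (INT8)}
$$
$$
32\ \text{channels} \times 8 = 256\ \text{decode channels}, \qquad 12\ \text{channels} \times 8 = 96\ \text{encode channels}
$$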
The server supports migration of algorithms for pedestrian/vehicle/object recognition, video structuring, trajectory and behavior analysis, and more, with high security and reliability, so it can be flexibly applied to a wide range of product development.
SOPHON SDK (BMNNSDK2), a one-stop deep learning development toolkit, provides a suite of software tools including the underlying driver environment, compiler, and inference deployment tools. It supports mainstream frameworks (Caffe/TensorFlow/PyTorch/MXNet/Paddle), mainstream network models and custom operator development, Docker containerization, and rapid deployment of algorithm applications.
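As an illustration of a typical deployment workflow, the sketch below exports a trained PyTorch model to ONNX before handing it to the SOPHON toolchain for compilation to a BM1684X model. The export step uses standard PyTorch APIs; the SOPHON compiler invocation itself is only indicated in a comment, since its exact command line depends on the SDK version and is an assumption here, not taken from this document.

```python
# Minimal sketch, assuming a PyTorch model is the starting point.
# Exporting to ONNX is a common, framework-neutral handoff before compiling
# with the SOPHON toolchain; the compiler step is shown only as a comment.
import torch
import torchvision

# Any trained network works here; ResNet-18 is used purely as a placeholder.
model = torchvision.models.resnet18(weights=None)
model.eval()

dummy_input = torch.randn(1, 3, 224, 224)  # NCHW input matching the model
torch.onnx.export(
    model,
    dummy_input,
    "resnet18.onnx",
    input_names=["input"],
    output_names=["output"],
    opset_version=13,
)

# Next step (outside this script, hypothetical invocation shape):
#   compile resnet18.onnx with the SOPHON SDK compiler for the BM1684X target,
#   producing a .bmodel that the on-board runtime loads for inference.
```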
The BMC intelligent management system provides a visual console interface through the HDMI output, making real-time monitoring, software configuration, hardware management, troubleshooting, and system upgrades straightforward. Secondary development is also supported.
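Since the specifications below list Redfish support in the BMC, out-of-band monitoring can also be scripted. The following is a minimal sketch assuming the standard Redfish service root and HTTP basic authentication; the BMC address and credentials are placeholders, not values from this document.

```python
# Minimal sketch of querying the BMC over Redfish, assuming the standard
# /redfish/v1 service root and basic authentication. Host and credentials
# below are placeholders.
import requests

BMC_HOST = "https://192.168.1.100"   # placeholder BMC address
AUTH = ("admin", "password")         # placeholder credentials

# Most Redfish services expose the systems collection at this standard path.
resp = requests.get(
    f"{BMC_HOST}/redfish/v1/Systems",
    auth=AUTH,
    verify=False,  # self-signed BMC certificates are common; adjust as needed
    timeout=10,
)
resp.raise_for_status()

for member in resp.json().get("Members", []):
    system = requests.get(
        f"{BMC_HOST}{member['@odata.id']}", auth=AUTH, verify=False, timeout=10
    ).json()
    # Print a few commonly available fields; exact properties depend on the BMC.
    print(system.get("Name"), system.get("PowerState"), system.get("Status"))
```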
Dual 10GbE ports plus Gigabit Ethernet ports meet business scenarios with high bandwidth requirements. An independent BMC management network interface separates the management network from the service network, ensuring secure and reliable network communication.
A 3.5-inch hard disk bay supports SATA3.0 HDD/SSD expansion, allowing the device to be easily extended to TB-level storage capacity.
Featuring a standard 1U rack server chassis with a depth of 576mm, this server fits a wide range of data center cabinets.
This server can be widely used in various scenarios, including edge computing, cloud storage, blockchain, multi-channel video encoding and decoding, and intelligent security.

Typical application scenarios:
- Edge computing
- Security
- Cloud storage
- Blockchain
- Multi-channel video encoding and decoding
- Financial industry
CSA1-N8S1684X Technical Specifications

| Item | Specification |
| --- | --- |
| Server form | 1U rack-mounted computing power server |
| Architecture | ARM |
| Number of nodes | 8 distributed computing nodes + 1 control node |
| Compute nodes | Octa-core 64-bit BM1684X processor, up to 2.3GHz |
| Video encoding | H.264: 3 × 4K@25fps, 12 × 1080p@25fps |
| Video decoding | H.264: 8 × 4K@25fps, 32 × 1080p@25fps |
| Control node | Octa-core 64-bit RK3588S processor, main frequency up to 2.4GHz, up to 6TOPS |
| AI computing power | 256TOPS (32TOPS × 8, INT8) |
| RAM | 16GB LPDDR4/LPDDR4X × 8 |
| Storage | 64GB eMMC × 8 (64GB/128GB optional) |
| Storage expansion | 1 × 3.5-inch/2.5-inch SATA3.0 SSD drive slot (hot-swappable; the BMC operates the drive directly, and computing nodes can access it indirectly through the network share provided by the BMC) |
| Power consumption | Normal: 300W; Max: 430W |
| Fan module | 6 high-speed cooling fans |
Physical Specifications

| Item | Specification |
| --- | --- |
| Size | 490.0mm (L) × 417.3mm (W) × 44.4mm (H) |
| Installation requirements | IEC 297 universal cabinet installation: 19 inches wide, 800mm deep or more; retractable slide rail installation: front-to-rear cabinet hole distance 543.5mm~848.5mm |
| Weight | Server net weight: 6.7kg; total weight with packaging: 8.9kg |
| Environment | Operating temperature: 0°C ~ 42°C; storage temperature: -40°C ~ 60°C; operating humidity: 20% ~ 80% RH (non-condensing) |
Software Specifications

| Item | Specification |
| --- | --- |
| BMC | The BMC management system integrates a web-based management interface and supports Redfish, VNC, NTP, advanced monitoring, and virtual media; the BMC management system can be redeveloped |
| Large language models | BM1684X: supports private deployment of ultra-large-scale parametric models under the Transformer architecture, such as the DeepSeek-R1, Gemma, Llama, ChatGLM, Qwen, and Phi series of large language models |
| Visual large models | BM1684X: supports private deployment of large visual models such as ViT, Grounding DINO, and SAM |
| AI painting | BM1684X: supports private deployment of the Flux, Stable Diffusion, and Stable Diffusion XL image generation models |
| Deep learning | All models: support traditional network architectures such as CNN, RNN, and LSTM; support deep learning frameworks such as TensorFlow, PyTorch, PaddlePaddle, ONNX, and Caffe; support custom operator development and Docker containerization management |
Interface Specifications

| Item | Specification |
| --- | --- |
| Network | 2 × 10G Ethernet (SFP+); 2 × Gigabit Ethernet (RJ45: 1 management network port, 1 ordinary network port) |
| Display | 1 × HDMI 2.0 (maximum resolution 1080p, display output from the main processor core board) |
| USB | 2 × USB3.0 host; 1 × Type-C (USB3.0 OTG, for processor core board debugging) |
| Others | 1 × SIM card slot, 2 × 4G antennas |