The NVIDIA Jetson Orin Nano edge computing module (8GB version) features a hexa-core Arm CPU and a 1024-core NVIDIA Ampere architecture GPU with 32 Tensor Cores. It can run multiple concurrent AI application pipelines, delivering powerful inference performance.
AIBOX-OrinNano delivers up to 40 TOPS of computing power and includes a 128GB PCIe NVMe SSD. The Jetson system, based on Ubuntu 22.04, offers a comprehensive desktop Linux environment with accelerated graphics and supports libraries such as NVIDIA CUDA, TensorRT, and cuDNN, making the device suitable for a wide range of AI application scenarios. (Support for the above varies across versions of the JetPack SDK; for details, please refer to the NVIDIA official website.)
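On a configured unit, it can be useful to sanity-check which parts of this stack are importable from Python before deploying an application. The snippet below is a minimal sketch, not vendor-provided tooling: it only probes module availability (the names `tensorrt`, `pycuda`, and `cv2` are the usual Python entry points for TensorRT, CUDA, and OpenCV, assuming they were installed alongside JetPack), and it degrades gracefully on machines without the stack.

```python
import importlib.util


def jetson_stack_report():
    """Report which JetPack-related Python libraries are importable.

    Returns a dict mapping library name -> bool. This only checks that
    the module can be found, not that the underlying GPU works.
    """
    libs = ["tensorrt", "pycuda", "cv2"]
    return {name: importlib.util.find_spec(name) is not None for name in libs}


if __name__ == "__main__":
    for name, ok in jetson_stack_report().items():
        print(f"{name}: {'available' if ok else 'missing'}")
```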
NVIDIA Jetson Orin offers unparalleled AI compute, large unified memory, and comprehensive software stacks, delivering superior energy efficiency to drive the latest generative AI applications. It is capable of fast inference for any generative AI model powered by the transformer architecture, providing superior edge performance on MLPerf.
Democratize edge AI and robotics development with the world's most comprehensive AI software stack and ecosystem, powered by generative AI at the edge and the NVIDIA Metropolis and Isaac™ platforms. NVIDIA JetPack™, Isaac ROS, and reference AI workflows enable seamless integration of cutting-edge technologies into your products, eliminating the need for costly internal AI resources. Experience end-to-end acceleration for AI applications and speed your time to market using the same powerful technologies that drive data centers and cloud deployments.
AIBOX-OrinNano supports H.265 video decoding of up to 11×1080p30, or combinations such as 1×4K60, 2×4K30, or 5×1080p60, meeting the diverse demands of AI application scenarios.
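When planning how many camera streams one box can decode, it helps to express each format in 1080p30-equivalent units (relative pixels × frame rate). The sketch below is a back-of-the-envelope planner: the unit costs and the 8-unit budget are rough assumptions derived from the vendor-quoted modes above (the quoted combinations do not all imply the same total throughput, so the budget here is taken from the most conservative one, 1×4K60), not an NVIDIA specification.

```python
# Rough H.265 decode stream-mix planner for AIBOX-OrinNano.
# Loads are in 1080p30-equivalent units: 4K has 4x the pixels of 1080p,
# and 60fps doubles the load versus 30fps.

UNIT_COST = {
    "4k60": 8,     # 4x pixels * 2x frame rate
    "4k30": 4,
    "1080p60": 2,
    "1080p30": 1,
}

# Most conservative vendor-quoted combination (1x4K60) -> 8 units.
BUDGET = 8


def mix_fits(streams):
    """Return True if the requested stream mix fits the assumed decode budget.

    `streams` maps a format name from UNIT_COST to a stream count,
    e.g. {"4k30": 1, "1080p30": 4}.
    """
    load = sum(UNIT_COST[fmt] * count for fmt, count in streams.items())
    return load <= BUDGET
```

For example, one 4K30 camera plus four 1080p30 cameras costs 4 + 4 = 8 units and fits the budget, while adding any stream on top of a 4K60 feed does not.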
The AI box features an industrial-grade all-metal enclosure with an aluminum alloy structure for thermal conduction. The sides of the top cover use a grille design for external airflow and efficient heat dissipation, ensuring computing performance and stability even under high-temperature operating conditions. The top cover itself uses a porous hexagonal design, combining elegance with efficient cooling. The compact, well-built device operates stably and meets the needs of various industrial-grade applications.
AIBOX-OrinNano is widely used in intelligent surveillance, AI education, computing-power-based services, edge computing, private deployment of large models, data security, and privacy protection.
AIBOX-OrinNX (16GB) vs. AIBOX-OrinNano (8GB)

| Basic Specifications | AIBOX-OrinNX (16GB) | AIBOX-OrinNano (8GB) |
| --- | --- | --- |
| Module | NVIDIA Jetson Orin NX (16GB) module | NVIDIA Jetson Orin Nano (8GB) module |
| CPU | Octa-core 64-bit Arm Cortex-A78AE v8.2 processor, up to 2.0GHz | Hexa-core 64-bit Arm Cortex-A78AE v8.2 processor, up to 1.5GHz |
| AI Performance | 100 TOPS | 40 TOPS |
| Video Encoding | H.265: 1×4K60, 3×4K30, 6×1080p60, 12×1080p30 | H.265: 1080p30 |
| Video Decoding | H.265: 1×8K30, 2×4K60, 4×4K30, 9×1080p60, 18×1080p30 | H.265: 1×4K60, 2×4K30, 5×1080p60, 11×1080p30 |
| Memory | 16GB LPDDR5 | 8GB LPDDR5 |
| Power Consumption | Normal: 7.2W (12V/600mA); Max: 33.6W (12V/2800mA) | Normal: 7.2W (12V/600mA); Max: 18W (12V/1500mA) |

Specifications common to both models:

| Item | Specification |
| --- | --- |
| GPU | 1024-core NVIDIA Ampere architecture GPU with 32 Tensor Cores |
| Storage Expansion | 128GB PCIe NVMe SSD (installed inside the device) |
| Power | DC 12V (5.5mm × 2.1mm barrel jack) |
| Size | 93.4mm × 93.4mm × 50mm |
| Weight | ≈500g |
| Environment | Operating temperature: -20℃~60℃; storage temperature: -20℃~70℃; storage humidity: 10%~90%RH (non-condensing) |
| OS | Jetson system based on Ubuntu 22.04, offering a comprehensive desktop Linux environment with accelerated graphics and supporting libraries such as NVIDIA CUDA, TensorRT, and cuDNN |
| Large Models | Robotic models: ROS models. Large language models: private deployment of ultra-large-scale-parameter Transformer models such as LLaMa2, ChatGLM, and Qwen. Vision models: ViT, Grounding DINO, and SAM. AI painting: Stable Diffusion V1.5 image generation model in the AIGC field |
| Traditional Network Architectures | Traditional network architectures such as CNN, RNN, and LSTM; deep learning frameworks including TensorFlow, PyTorch, MXNet, PaddlePaddle, ONNX, and Darknet; custom operator development; Docker container management for easy image deployment |
| AI Software Stack | Unparalleled AI compute, large unified memory, and comprehensive software stacks with superior energy efficiency for generative AI; fast inference for transformer-based generative AI models, with superior edge performance on MLPerf |
| Ethernet | 1×Gigabit Ethernet (1000Mbps, RJ45) |
| Video Output | 1×HDMI 2.1 (4K@60fps) |
| USB | 2×USB 3.0 (Max: 1A) |
| Watchdog | External watchdog supported |
| Other | 1×Type-C (USB 2.0 OTG), 1×Console (debug serial port), 1×Recovery, 1×power button |
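When choosing a 12V supply for either variant, the maximum power figures in the spec table translate directly into a minimum adapter current rating. A small worked example follows; the 25% headroom factor is an illustrative assumption, not a vendor recommendation.

```python
# Power-supply sizing for the DC 12V input, from the spec table's
# maximum power consumption figures.

def required_current_a(max_power_w, voltage_v=12.0, headroom=1.25):
    """Minimum adapter current rating in amps, with safety headroom.

    I = P / V, scaled by an (assumed) 25% headroom factor.
    """
    return max_power_w / voltage_v * headroom


# AIBOX-OrinNX: 33.6W max -> 2.8A draw -> 3.5A adapter with headroom
orin_nx_a = required_current_a(33.6)

# AIBOX-OrinNano: 18W max -> 1.5A draw -> 1.875A adapter with headroom
orin_nano_a = required_current_a(18.0)
```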
We welcome your feedback; your comments and suggestions motivate us!
Copyright © 2014 - 2023 FIREFLY TECHNOLOGY CO.,LTD | 粤ICP备14022046号-2