Cockpit Intelligence

NXP i.MX8 NyxCore · BAQEN

The operator's complete picture.

Every autonomous journey begins with the operator. Before any vehicle can make decisions independently, the human in the cab needs complete situational awareness — vehicle systems, sensor status, navigation, and field conditions unified in a single ruggedized interface.

NyxCore, running on NXP i.MX8 silicon, is the compute foundation of Layer 01. BAQEN, Tarhund's ruggedized vehicle command terminal, is the interface layer — a low-latency touchscreen display that integrates navigation, vehicle subsystem status, live sensor feeds, and operational data into one field-ready screen.

Key Capabilities
  • Real-time navigation and field mapping
  • Vehicle subsystem status and health monitoring
  • Live sensor data visualization
  • Camera feed integration
  • CAN / CAN-FD, Ethernet, and wireless connectivity
  • Containerized software stack — adaptable to any vehicle architecture
Available · Deployed
[Image: BAQEN touchscreen terminal mounted in a heavy machinery cab at dusk]
[Image: Autonomous tractor in a field with LiDAR point-cloud sensor visualization]
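
To make the vehicle-bus side of that integration concrete, the sketch below reads subsystem status frames over Linux SocketCAN. It is a minimal illustration, assuming a configured can0 interface; the frame IDs and payload layouts are hypothetical placeholders, not BAQEN's actual frame catalogue.

    # Minimal SocketCAN reader (illustrative sketch, Python on Linux).
    # The frame IDs and payload layouts below are hypothetical.
    import socket
    import struct

    CAN_FRAME_FMT = "<IB3x8s"    # struct can_frame: id, dlc, padding, data
    CAN_FRAME_SIZE = struct.calcsize(CAN_FRAME_FMT)

    ENGINE_STATUS_ID = 0x100     # hypothetical subsystem frame IDs
    HYDRAULICS_ID = 0x101

    def read_frames(channel: str = "can0") -> None:
        sock = socket.socket(socket.AF_CAN, socket.SOCK_RAW, socket.CAN_RAW)
        sock.bind((channel,))
        while True:
            can_id, dlc, data = struct.unpack(CAN_FRAME_FMT,
                                              sock.recv(CAN_FRAME_SIZE))
            can_id &= socket.CAN_EFF_MASK   # strip EFF/RTR/ERR flag bits
            payload = data[:dlc]
            if can_id == ENGINE_STATUS_ID:
                rpm = struct.unpack_from("<H", payload)[0]
                print(f"engine rpm: {rpm}")
            elif can_id == HYDRAULICS_ID:
                pressure = struct.unpack_from("<H", payload)[0]
                print(f"hydraulic pressure: {pressure} kPa")

    if __name__ == "__main__":
        read_frames()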

Sensor Fusion Engine

AMD-Xilinx XedCore

The machine's senses.

Autonomous decisions are only as good as the environmental data feeding them. A vehicle that cannot perceive its surroundings with precision and speed cannot act on them with confidence.

XedCore processes LiDAR, RADAR, and inertial sensor streams in parallel at the hardware level — not sequentially in software. Built on AMD-Xilinx FPGA/SoC architecture, XedCore delivers fused environmental intelligence with deterministic latency that conventional MCU or software-based approaches cannot guarantee.
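
As a software analogy of that fusion step, the sketch below time-aligns buffered samples from each sensor before fusing them. XedCore performs this in parallel FPGA pipelines; this sequential Python version, with hypothetical sample structures and a hypothetical 5 ms tolerance, only illustrates the alignment logic.

    # Time-alignment sketch for multi-sensor fusion (illustrative only).
    # XedCore does this in parallel FPGA pipelines; this sequential
    # version exists purely to show the alignment logic.
    from dataclasses import dataclass

    @dataclass
    class Sample:
        t: float       # timestamp, seconds
        value: object  # point cloud, radar return, or IMU reading

    def nearest(buffer: list[Sample], t: float) -> Sample | None:
        """Return the buffered sample closest in time to t."""
        return min(buffer, key=lambda s: abs(s.t - t)) if buffer else None

    def fuse_at(t: float, lidar: list[Sample], radar: list[Sample],
                imu: list[Sample], tol: float = 0.005) -> dict | None:
        """Fuse the samples nearest to t if every sensor is within tol."""
        picks = {"lidar": nearest(lidar, t),
                 "radar": nearest(radar, t),
                 "imu": nearest(imu, t)}
        if any(s is None or abs(s.t - t) > tol for s in picks.values()):
            return None  # incomplete frame; a real pipeline would flag it
        return {name: s.value for name, s in picks.items()}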

In unstructured environments — fields, construction sites, unpaved terrain — where conditions change faster than any sequential processing pipeline can track, hardware-level parallelism is not merely a performance advantage. It is a safety requirement.

Key Capabilities
  • Simultaneous parallel processing of LiDAR, RADAR, and IMU data streams
  • Deterministic low-latency sensor fusion — hardware guaranteed
  • Environment-agnostic operation — no map dependency
  • Real-time output feed to Layer 03 AI inference engine
  • Configurable sensor interface for platform-specific sensor suites
Design Complete

Autonomous AI Core

NVIDIA

From perception to decision.

The final layer closes the loop between sensing and action. Once the environment is sensed and fused by Layer 02, the Autonomous AI Core interprets that data — identifying obstacles, reading terrain, understanding operational context, and issuing control decisions — entirely on-device, without cloud dependency.

Built on NVIDIA's embedded AI inference platform, this layer is designed for the specific demands of off-road autonomy: unstructured environments, intermittent or absent connectivity, and operating conditions that no pre-loaded map covers. The system sees, interprets, and decides within the machine itself.

Functional scope includes surround awareness, obstacle detection and avoidance, terrain classification, and operational decision-making, with a roadmap toward learning capabilities for continuous field improvement.
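
A highly simplified view of that perception-to-decision mapping might look like the sketch below. The class labels, distance thresholds, and control actions are hypothetical stand-ins; the production path runs on NVIDIA's embedded inference runtime, not a plain-Python loop.

    # Hypothetical perception-to-decision mapping (illustrative only).
    # Labels, thresholds, and actions are placeholder assumptions.
    from dataclasses import dataclass
    from enum import Enum, auto

    class Action(Enum):
        CONTINUE = auto()
        SLOW = auto()
        STOP_AND_REPLAN = auto()

    @dataclass
    class Detection:
        label: str        # e.g. "person", "rock", "soft_soil"
        distance_m: float

    def decide(detections: list[Detection]) -> Action:
        """Map fused-scene detections to a control decision, on-device."""
        for d in detections:
            if d.label == "person" and d.distance_m < 15.0:
                return Action.STOP_AND_REPLAN
            if d.label == "rock" and d.distance_m < 5.0:
                return Action.SLOW
        return Action.CONTINUE

    # A person detected 10 m ahead forces a stop-and-replan.
    assert decide([Detection("person", 10.0)]) is Action.STOP_AND_REPLAN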

In Development
[Image: Autonomous tractor under a night sky with an AI decision-tree visualization overhead]

Your platform, your pace.

Cockpit Intelligence · Sensor Fusion Engine · Autonomous AI Core

A fleet operator may begin with Layer 01 alone — giving operators complete situational awareness before any automation is introduced.

A manufacturer with existing sensor hardware may integrate Layer 02 to add real-time environmental intelligence to a platform that already has a display system.

A program targeting full autonomy may adopt all three layers as a complete stack.

There is no prescribed sequence. The architecture scales to the ambition of the program.

From vision to deployment.

Most embedded design challenges begin not with a technical specification but with a vision: a vehicle that works without a driver, a machine that understands its environment, a platform that makes decisions in the field.

Tarhund's design services begin at that vision. We work with customers and partners during the requirement definition phase — translating operational goals into complete systems engineering documentation before a single component is specified.

From that foundation, we execute: custom hardware design, board bring-up, BSP development, application software, and full system integration. The result is not a reference design adapted to your platform. It is an embedded system engineered specifically for your application, from the first requirement to the final field test.

Ready to discuss your platform?

Whether you have a detailed specification or an early-stage vision, we start where you are.

Discuss your requirements →