MACHINE VISION · ISRAEL · AUTHORIZED HIKROBOT INTEGRATION PARTNER

Machine Vision
Engineered to
Sub-Pixel Precision

Authorized HikRobot integrator in Israel. We architect and deploy full vision stacks — Sony IMX sensor selection, telecentric optics, structured lighting, and on-board neural inference — integrated directly into your robot controller or PLC.

±0.01mm
REPEATABILITY
sub-pixel measurement
<16ms
INFERENCE
on-board neural net
5MP
MAX RESOLUTION
SC6000 sensor
GigE·USB3
INTERFACES
GenICam compliant
VISION CAPABILITIES

What Our Vision Systems Actually Do

Each deployment is specified down to the exact sensor, optics, lighting, and algorithm. No generic "AI-powered" claims — the exact camera model, exact tolerance, exact cycle time.

Code Reading & Traceability

HikRobot ID2000/ID5000 fixed readers and IDH7000 handhelds decode 1D, 2D, QR, and DPM codes at up to 60 reads/sec — omnidirectional decoding, ±45° tilt tolerance, automatic contrast-threshold adaptation. Direct OPC-UA output to MES/WMS.

1D · 2D · DPM · Aztec decode at line speed
OPC-UA / MQTT output → MES / WMS / SAP
Serialisation, GS1 compliance, traceability
Label print-and-verify before dispatch
HARDWARE: ID2000 / ID5000 / IDH7000
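The GS1 serialisation step above rests on a check-digit rule that a print-and-verify station can recompute for every label. A minimal sketch in Python of the generic GS1 algorithm (not tied to any HikRobot SDK):

```python
def gtin_check_digit(payload: str) -> int:
    """Compute the GS1 check digit for a GTIN payload (all digits except the last).

    Weights alternate 3, 1, 3, 1, ... starting from the rightmost payload digit.
    """
    total = sum(int(d) * (3 if i % 2 == 0 else 1)
                for i, d in enumerate(reversed(payload)))
    return (10 - total % 10) % 10

def is_valid_gtin(code: str) -> bool:
    """Validate a full GTIN-8/12/13/14 string as decoded from a barcode."""
    return code.isdigit() and len(code) >= 8 and int(code[-1]) == gtin_check_digit(code[:-1])
```

A reader pipeline would run `is_valid_gtin` on every decode result before pushing it to the MES; for example, the well-known EAN-13 `4006381333931` validates, while any single-digit corruption fails.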

Pose Estimation & Robot Guidance

CA series GigE cameras + HikRobot 3D structured-light sensors deliver 6-DoF pose estimation for bin picking, flexible feeding, and visual servoing. Point-cloud processing in <100 ms; calibration to the robot base frame with sub-mm accuracy.

6-DoF pose estimation for bin picking
Visual servoing — real-time TCP correction
Weld seam detection & path adaptation
Flexible feeding without hard fixtures
HARDWARE: CA Series + DB500S 3D Sensor
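"Calibration to the robot base frame" means chaining two rigid transforms: the controller's gripper pose and the camera-to-gripper transform found by hand-eye calibration (e.g. OpenCV's `calibrateHandEye`). A minimal NumPy sketch with hypothetical numbers and identity rotations for readability:

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical calibration results (mm):
# T_base_gripper comes from the robot controller at grab time,
# T_gripper_cam from an offline hand-eye calibration routine.
T_base_gripper = make_transform(np.eye(3), [400.0, 0.0, 300.0])
T_gripper_cam  = make_transform(np.eye(3), [0.0, 50.0, -30.0])

def camera_to_base(p_cam):
    """Map a 3D point from the camera frame into the robot base frame."""
    p = np.append(np.asarray(p_cam, dtype=float), 1.0)  # homogeneous coordinates
    return (T_base_gripper @ T_gripper_cam @ p)[:3]
```

In a real bin-picking cell both rotations are non-trivial and `T_base_gripper` is re-read from the controller every cycle; the composition logic stays exactly this.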

Inline Inspection & Gauging

CL series line-scan cameras (up to 16K resolution) and CA area-scan cameras with telecentric lenses achieve ±5µm dimensional accuracy. Deep-learning segmentation classifies surface-defect types with a <0.1% false-positive rate.

±5µm dimensional gauging — no contact
Deep-learning surface defect segmentation
Pin presence / solder joint verification
100% inline — zero sampling required
HARDWARE: CL Line Scan · CA Area Scan
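The ±5µm figure follows directly from pixel pitch, telecentric magnification, and sub-pixel edge localisation. A back-of-envelope calculator — the 3.45 µm pitch, 0.5× magnification, and 1/10-pixel localisation below are illustrative assumptions, not the spec of a particular CL/CA model:

```python
def object_space_resolution_um(pixel_pitch_um: float, magnification: float) -> float:
    """Object-space size of one pixel behind a fixed-magnification telecentric lens."""
    return pixel_pitch_um / magnification

def gauge_resolution_um(pixel_pitch_um: float, magnification: float,
                        subpixel_factor: int = 10) -> float:
    """Approximate measurement resolution with sub-pixel edge localisation."""
    return object_space_resolution_um(pixel_pitch_um, magnification) / subpixel_factor
```

With these assumed numbers one pixel covers 6.9 µm of the part and sub-pixel edge fitting brings the measurement resolution to roughly 0.7 µm — comfortably inside a ±5µm tolerance budget once lighting and mounting stability are accounted for.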

On-Board Neural Inference

SC6000 AI smart camera runs ResNet/MobileNet classification and anomaly detection on an embedded NVIDIA module — no external vision PC. Models are trained on combined synthetic and real datasets and updated over-the-air. Achieves <0.05% false reject rate in production.

Anomaly detection — no labelled defect set needed
ResNet / MobileNet classification on-board
Synthetic training data augmentation
<0.05% false reject · <0.01% escape rate
HARDWARE: SC6000 · NVIDIA Edge Inference
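"No labelled defect set needed" means the model learns only what good parts look like and flags anything that deviates. A toy NumPy illustration of that idea — a Gaussian feature-distance score, deliberately far simpler than the neural models that actually run on the camera:

```python
import numpy as np

class GaussianAnomalyScorer:
    """Fit on feature vectors from good parts only; score deviation from them.

    Stand-in for an embedded anomaly model: high score = unlike any good part.
    """
    def fit(self, good_features):
        X = np.asarray(good_features, dtype=float)
        self.mean = X.mean(axis=0)
        self.std = X.std(axis=0) + 1e-8  # avoid division by zero
        return self

    def score(self, features):
        z = (np.asarray(features, dtype=float) - self.mean) / self.std
        return float(np.sqrt((z ** 2).mean()))  # RMS z-distance from "good"
```

Production systems replace the Gaussian assumption with learned embeddings, but the workflow is the same: collect good parts, fit, set a score threshold, reject above it.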
AUTHORIZED HIKROBOT INTEGRATION PARTNER · ISRAEL

Full Vision Stack. One Integration Partner.

HikRobot — 60M+ cameras deployed globally, GenICam-compliant, HALCON/OpenCV SDK. We handle sensor selection, C-mount / telecentric optics, coaxial & structured lighting, GigE / USB3 cabling, and software integration with your FANUC, ABB, or KUKA controller.

±5µm
GAUGE ACCURACY
<16ms
INFERENCE TIME
16K
MAX RESOLUTION
IP67
PROTECTION
INDUSTRIES & TOLERANCES

Specified for Your Environment

Camera protection class, lighting wavelength, and lens working distance are specified per application — IP54 for food lines, IP67 for coolant environments, Class 5 cleanroom for pharma.

Electronics
AOI · ±5µm · Solder Joint SPI · BGA Inspection
Food & Bev
IP54 Washdown · Fill Level ±1mm · Foreign Object
Pharma
GMP / 21 CFR · Serialisation · Blister Integrity
Automotive
±10µm Gauging · Weld Seam AI · Pose Estimation
Logistics
60 reads/sec · OPC-UA → WMS · Void Detection
Metalworking
IP67 Coolant · Surface Crack AI · Burr ±20µm
HIKROBOT SENSOR FAMILIES

Sensor → Optics → Lighting → Software

Sony IMX CMOS sensors · GigE Vision / USB3 Vision · GenICam compliant · HALCON, OpenCV, and ROS2 SDK. We specify the full optical stack, not just the camera.

ENGINEERING PROCESS

Specification-First. No Surprises at Go-Live.

We specify every optical parameter before ordering hardware — sensor size, pixel pitch, working distance, depth of field, lighting angle, and frame rate. This eliminates integration surprises.

Optical Specification

Part geometry, surface finish, contrast targets, cycle time, and reject tolerance → sensor resolution, pixel pitch, lens magnification, lighting wavelength.
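The first arrow above — part geometry to sensor resolution — reduces to a rule of thumb: cover the smallest feature you must detect with at least ~3 pixels. A sketch of that sizing step (the 3-pixel factor is a common heuristic, not a HikRobot specification):

```python
import math

def required_pixels(fov_mm: float, min_feature_mm: float,
                    px_per_feature: int = 3) -> int:
    """Minimum sensor pixels along one axis to resolve the smallest feature.

    fov_mm: field of view along that axis; min_feature_mm: smallest detail
    that must be resolved; px_per_feature: pixels spanning that detail.
    """
    return math.ceil(fov_mm / min_feature_mm * px_per_feature)
```

For a 50 mm field of view and a 0.1 mm minimum feature this asks for 1500 px along that axis, so a 5MP sensor (e.g. 2448 × 2048) clears the requirement with margin; the same arithmetic run in reverse fixes the lens magnification.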

System Architecture

Camera model, lens focal length, LED wavelength and angle, controller, I/O map, and software stack specified before any hardware is ordered.

Integration & Calibration

Physical mount, camera-to-robot calibration (hand-eye), vision program, I/O handshake, and HMI. Acceptance test on real parts at your facility.

Validation & SLA

FP/FN measurement report on production samples. 8-hour fault SLA. Remote monitoring. Annual recalibration and model retraining as part of maintenance.

±5µm
GAUGE ACCURACY
<16ms
NEURAL INFERENCE
<0.05%
FALSE REJECT RATE
GET STARTED

Have a Vision Application? Talk to a Vision Engineer.

Schedule a 30-minute feasibility call with one of our vision engineers. We assess whether the application is solvable with vision, what accuracy is realistic, and what a deployment would involve — before anyone commits to anything.