PCI-Express host interface: specification v3.0, x8 and higher; backward compatible with PCIe 2.0 and 1.1; Connector interface: Open Compute Project Intel Motherboard spec v2.0.
Enables low-latency RDMA over Ethernet (supported on both non-virtualized and SR-IOV-enabled virtualized servers), with latency as low as 1 µs.
TCP/UDP/IP stateless offload in hardware; traffic steering across multiple cores; intelligent interrupt coalescence.
System Requirements: FreeBSD; Linux; VMware ESXi; Windows Server 2008 R2 / 2012 R2 / 2016 / 2019; Windows 7/8/8.1/10 (32/64-bit); one available Open Compute Project Intel Motherboard slot.
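On a Linux host from the supported list above, presence of the card can be confirmed from the PCI bus before any configuration. The commands below are a minimal sketch, assuming the `lspci` utility and the in-box `mlx4_en` driver; the sample `lspci` line is illustrative, and the grep filter is demonstrated against it so the snippet is self-contained.

```shell
#!/bin/sh
# On a live host you would run:
#   lspci | grep -i mellanox      # lists the ConnectX-3 device if detected
#   modinfo mlx4_en | head -n 3   # confirms the Ethernet driver is available
#
# The same case-insensitive filter, applied to a sample lspci line
# (device ID and bus address are illustrative):
sample='04:00.0 Ethernet controller: Mellanox Technologies MT27500 Family [ConnectX-3]'
printf '%s\n' "$sample" | grep -io 'connectx-3'
```

If the grep prints the device name, the adapter is visible to the OS and driver installation can proceed.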
Product Description
The HINYSENO OCP 2.0 Mellanox ConnectX-3 dual-port 10G SFP+ adapter with PCI Express 3.0 delivers high-bandwidth, industry-leading Ethernet connectivity for performance-driven server and storage applications in enterprise data centers, high-performance computing, and embedded environments. Clustered databases, web infrastructure, and high-frequency trading are just a few applications that will achieve significant throughput and latency improvements, resulting in faster access, real-time response, and more users per server. ConnectX-3 EN improves network performance by increasing available bandwidth while decreasing the associated transport load on the CPU, especially in virtualized server environments.
Specification
Intelligent interrupt coalescence
Industry-leading throughput and latency performance
Software compatible with standard TCP/UDP/IP stacks
Legacy and UEFI PXE network boot support
Supports a software iSCSI initiator in NIC mode via the NIC driver
Supported operating systems: FreeBSD, Linux 5.x and above
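For the SR-IOV virtualized path noted in the features above, virtual functions are created on the host before VMs can use them. This is a configuration sketch only, assuming a Linux host with the in-box mlx4 driver and SR-IOV enabled in firmware/BIOS; the interface name (eth2) and VF count (4) are illustrative and not taken from this product page.

```shell
# Create 4 virtual functions on the adapter's Ethernet port
# (interface name varies per system):
echo 4 > /sys/class/net/eth2/device/sriov_numvfs

# The new VFs appear as "vf 0" .. "vf 3" in the link listing:
ip link show eth2
```

The VFs can then be passed through to guests, giving each VM direct hardware access with the low-latency RoCE support described above.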