Dell Brocade 825 Brocade Adapters Troubleshooting Guide - Page 142




Brocade Adapters Troubleshooting Guide, 53-1002145-01 (page 118)

4 Tuning network drivers (CNA or NIC)
Jumbo packet size

Recommendations to enhance performance

Increase throughput by setting the MTU to 9000 bytes.

How to change values

Refer to the instructions for Windows "network driver parameters" in the "Adapter Configuration" appendix of the Brocade Adapters Installation and Reference Manual.

References for more tuning information

Refer to 10Gbps Networking Performance on ESX 3.5 Update 1, available through www.vmware.com.
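As a sketch only, the MTU change above might look like the following on a Linux host, with the vSwitch step for ESX shown as a comment. The interface name eth0 and vSwitch name vSwitch0 are placeholders; substitute the names used in your environment, and note that every device in the path must support jumbo frames.

```shell
# Verify the current MTU (interface name "eth0" is a placeholder):
ip link show dev eth0

# Raise the MTU to 9000 bytes to enable jumbo frames (requires root):
ip link set dev eth0 mtu 9000

# On VMware ESX, the virtual switch must also carry jumbo frames, e.g.:
# esxcfg-vswitch -m 9000 vSwitch0
```

These commands take effect immediately but do not persist across reboots unless added to the host's network configuration.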
NetQueue

NetQueue improves receive-side networking performance on servers in 10 Gigabit Ethernet virtualized environments. NetQueue provides multiple receive queues on the CNA, or on a Fabric Adapter port configured in CNA mode, which allows processing on multiple CPUs to improve network performance.

MSI-X is an extended version of Message Signaled Interrupts, defined in the PCI 3.0 specification. All Brocade adapters support MSI-X, which helps improve overall system performance by lowering interrupt latency and improving host CPU utilization. MSI-X is enabled by default in VMware ESX Server and must remain enabled for NetQueue to function. Make sure that bnad_msix=0 is not listed in the VMware module parameters, because that setting disables NetQueue.

For the Brocade driver, you cannot directly configure the number of NetQueues or the number of filters per NetQueue. By default, these values are derived from the number of receive queue sets used, which is calculated from the number of CPUs in the system.
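The module-parameter check described above can be sketched as follows from the ESX service console. The module name bna is an assumption for the Brocade network driver and may differ on your release; confirm it before making changes.

```shell
# List the options currently set for the Brocade network driver module
# (module name "bna" is assumed; confirm with "esxcfg-module -l"):
esxcfg-module -g bna

# If the output includes bnad_msix=0, NetQueue is disabled. Clearing
# the option string restores the MSI-X default:
# esxcfg-module -s "" bna
# A host reboot is required for module parameter changes to take effect.
```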
Default value: Disable

Possible values: Enable, Disable
Recommendations to enhance performance

Enabling NetQueue utilizes multiple receive queues on the Brocade adapter, which can be serviced by multiple CPUs on the host system, improving performance.

How to change values

Refer to the instructions for Windows "network driver parameters" in the "Adapter Configuration" appendix of the Brocade Adapters Installation and Reference Manual.

References for more tuning information

Refer to 10Gbps Networking Performance on ESX 3.5 Update 1, available through www.vmware.com.