April 18th, 2015 02:00

Performance issue with R720 and SSD disks

Hello,

We were testing the I/O throughput and IOPS of a Dell R720 with SSD disks, but something went wrong.

Our test environment is:

Dell R720 (2 x 8 bays) with the following specs:

2 x Intel Xeon E5-2650 v2

256GB RAM

1 x PERC H710P Mini with 2 x SSD Dell LB206M in RAID 1 (OS)

1 x PERC H710P Adapter with 8 x SSD Dell LB806R in RAID 10 (data)

All components have the latest firmware according to what is published today (18/04/2015) on support.dell.com, specifically:

BIOS 2.5.2

PERC 21.3.1-0004

LB206M D327

LB806R D327

During our first simple tests we noticed that, with hdparm, the sequential read speed was about 1 GB/s:

/dev/sdb:

Timing buffered disk reads: 2930 MB in  3.00 seconds = 976.48 MB/sec

Timing buffered disk reads: 3104 MB in  3.00 seconds = 1034.40 MB/sec

Timing buffered disk reads: 2988 MB in  3.00 seconds = 995.35 MB/sec

Timing buffered disk reads: 3144 MB in  3.00 seconds = 1047.72 MB/sec
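For reference, the timing loop behind numbers like the above can be sketched in a few lines of Python. This is a rough approximation of what `hdparm -t` does (large buffered sequential reads for about 3 seconds, reported in decimal MB/s), not hdparm itself; the `/tmp/testfile` path is just a placeholder, and you would point it at `/dev/sdb` (as root) to approximate the test in this post:

```python
# Minimal sketch of a buffered sequential-read timing loop, in the spirit of
# `hdparm -t`. Assumption: hdparm reads large chunks through the buffer cache
# for ~3 seconds and reports decimal MB/s; this does the same against a path.
import os
import time

def timed_sequential_read(path, duration=3.0, chunk=1024 * 1024):
    """Read `path` sequentially for ~`duration` seconds; return MB/s."""
    total = 0
    with open(path, "rb") as f:
        start = time.monotonic()
        while time.monotonic() - start < duration:
            data = f.read(chunk)
            if not data:      # hit end of file/device: wrap around and keep reading
                f.seek(0)
                continue
            total += len(data)
        elapsed = time.monotonic() - start
    return total / elapsed / 1e6  # decimal megabytes per second, like hdparm

if __name__ == "__main__":
    mbps = timed_sequential_read("/tmp/testfile")  # placeholder path
    print(f"Timing buffered disk reads: {mbps:.2f} MB/sec")
```

Unlike hdparm, this sketch does not flush the page cache first, so repeated runs against a small file will report cache speed rather than disk speed; on a multi-terabyte device that effect is negligible.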

After that we wanted to check IOPS. We downloaded a great Linux script for measuring IOPS, available at:

https://github.com/cxcv/iops

Results were good. Not as high as I expected based on the SanDisk specs, but good enough:

/dev/sdb,   3.20 TB, 32 threads:

 512   B blocks: 48058.3 IO/s,  23.5 MiB/s (196.8 Mbit/s)

   1 KiB blocks: 47649.3 IO/s,  46.5 MiB/s (390.3 Mbit/s)

   2 KiB blocks: 48178.9 IO/s,  94.1 MiB/s (789.4 Mbit/s)

   4 KiB blocks: 48533.7 IO/s, 189.6 MiB/s (  1.6 Gbit/s)

   8 KiB blocks: 46377.6 IO/s, 362.3 MiB/s (  3.0 Gbit/s)

  16 KiB blocks: 37399.7 IO/s, 584.4 MiB/s (  4.9 Gbit/s)

  32 KiB blocks: 25662.6 IO/s, 802.0 MiB/s (  6.7 Gbit/s)

  64 KiB blocks: 16983.0 IO/s,   1.0 GiB/s (  8.9 Gbit/s)

 128 KiB blocks: 10212.7 IO/s,   1.2 GiB/s ( 10.7 Gbit/s)

 256 KiB blocks: 5483.1 IO/s,   1.3 GiB/s ( 11.5 Gbit/s)

 512 KiB blocks: 2686.7 IO/s,   1.3 GiB/s ( 11.3 Gbit/s)

   1 MiB blocks: 1327.8 IO/s,   1.3 GiB/s ( 11.1 Gbit/s)

   2 MiB blocks:  677.9 IO/s,   1.3 GiB/s ( 11.4 Gbit/s)

   4 MiB blocks:  345.1 IO/s,   1.3 GiB/s ( 11.6 Gbit/s)

   8 MiB blocks:  171.9 IO/s,   1.3 GiB/s ( 11.5 Gbit/s)

  16 MiB blocks:   85.0 IO/s,   1.3 GiB/s ( 11.4 Gbit/s)

  32 MiB blocks:   43.3 IO/s,   1.4 GiB/s ( 11.6 Gbit/s)

  64 MiB blocks:   21.4 IO/s,   1.3 GiB/s ( 11.5 Gbit/s)
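The measurement above can also be sketched directly: the iops script linked earlier essentially issues concurrent random reads of a given block size and counts how many complete per second. The following is a simplified stand-in for that script, not a copy of it; the block size, thread count, and file path are assumptions (the real script sweeps block sizes and detects the device size, which this sketch does not):

```python
# Rough sketch of a threaded random-read IOPS measurement, loosely modeled on
# the kind of test the iops script performs. Works on a regular file; for a
# block device like /dev/sdb the size would have to be detected differently.
import os
import random
import threading
import time

def measure_iops(path, block_size=4096, threads=4, duration=2.0):
    """Issue random `block_size` reads from `path`; return aggregate IO/s."""
    size = os.path.getsize(path)
    counts = [0] * threads
    stop = time.monotonic() + duration

    def worker(idx):
        fd = os.open(path, os.O_RDONLY)
        try:
            while time.monotonic() < stop:
                # pick a random block-aligned offset and read one block
                offset = random.randrange(0, max(size // block_size, 1)) * block_size
                os.pread(fd, block_size, offset)
                counts[idx] += 1
        finally:
            os.close(fd)

    workers = [threading.Thread(target=worker, args=(i,)) for i in range(threads)]
    for t in workers:
        t.start()
    for t in workers:
        t.join()
    return sum(counts) / duration

if __name__ == "__main__":
    rate = measure_iops("/tmp/testfile")  # placeholder path
    print(f"{rate:.1f} IO/s")
```

Note that these reads go through the page cache; the real script opens the device with direct I/O so that every read actually hits the disk, which matters when comparing against drive specs.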

 

Next we noticed that the hdparm sequential read results dropped dramatically.

Now the sequential read speed, instead of 1.0 GB/s, stays around 400-500 MB/s: just half the speed.

/dev/sdb:

 Timing buffered disk reads: 1394 MB in  3.00 seconds = 464.05 MB/sec

 Timing buffered disk reads: 1590 MB in  3.00 seconds = 529.20 MB/sec

 Timing buffered disk reads: 1518 MB in  3.00 seconds = 505.68 MB/sec

 Timing buffered disk reads: 1490 MB in  3.00 seconds = 496.37 MB/sec

We have to reboot in order to restore previous speed.

I think the problem is related to a bug in the BIOS, the PERC firmware, or the disk firmware.

It happens whenever the RAID controller, BIOS, or disk firmware is put under heavy load (even for less than a minute) or simply has many operations to process.

I also consider this a bug because when running the same identical tests on a Dell R510 with a PERC H700, the issue does not occur.

For reference, the R510 has the following components, all with the latest firmware according to support.dell.com:

R510 ( 12 x 3.5" bays )

2 x L5640

64GB RAM

PERC H700 1GB NV CACHE

2 x LB206M (OS)

8 x LB806R (data)

In this scenario, even after the IOPS test, sequential reads still work like a charm, while on the R720 they do not.

Waiting for your reply

Best regards
