4.8 ffs_dirpref problem
Ken Marx
kmarx at sploot.vicor-nb.com
Fri Oct 31 18:57:11 PST 2003
On Thu, Oct 30, 2003 at 02:35:41PM -0800, Kirk McKusick wrote:
> I know it takes a lot of time, but I would like to hear of the results
> when you do the initial loading of the filesystem using Don's code as
> that may well affect the set of choices that it has.
>
> Kirk McKusick
>
Just a follow-up:
I emailed this morning about re-doing the test from scratch, newfs'ing
with the afore-mentioned overrides for average files per directory
and average file size.
I've now rerun after newfs'ing with no overrides, again with Don's
patch and our hashtable patch applied. The results are pretty
similar to before.
Things march along at about 64-65 sec per 1.5 GB untar until the
low 90% range of disk usage, where things bog down a bit. This time
the peak times are a bit higher, but still acceptable.
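For scale: each df sample in the output below grows by about 1,527,128
1K-blocks (roughly 1.46 GiB) per untar, so a representative mid-run
real time of ~125 sec works out to roughly 12 MB/s. A quick
back-of-the-envelope sketch, using two representative samples picked
from the output:

```python
# Back-of-the-envelope rate check; the two "used" values and the
# real time are representative picks from the df/time output below.
blocks_per_iter = 448976012 - 447448884   # delta of "used" 1K-blocks
gib_per_iter = blocks_per_iter * 1024 / 2**30
real_seconds = 124.90                     # one mid-run "real" time
mb_per_s = blocks_per_iter * 1024 / 1e6 / real_seconds
print(round(gib_per_iter, 2), round(mb_per_s, 1))
```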
That's nice for us, as we don't have to re-newfs our
production disks to take advantage of Don's patch.
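For anyone who does want to change the hints on an existing filesystem
rather than re-newfs'ing, tunefs can apply overrides like the ones
shown in the run headers below. A sketch only: the device name is this
system's, and the filesystem should be unmounted when tuning:

```shell
# Sketch: apply dirpref hints to an existing (unmounted) UFS filesystem.
# -f = expected average file size in bytes, -s = expected files per directory.
# /dev/da0s1e is the device from this report; substitute your own.
tunefs -f 49152 -s 1500 /dev/da0s1e
```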
Below are cut/paste from the end of both runs.
# tunefs: average file size: (-f) 49152
# tunefs: average number of files in a directory: (-s) 1500
---------------------------------------------------------------------
/dev/da0s1e 558889580 441340370 72838044 86% 6080003 64094715 9% /raid
154.11 real 1.67 user 63.63 sys
/dev/da0s1e 558889580 442867498 71310916 86% 6101041 64073677 9% /raid
181.47 real 1.55 user 62.14 sys
/dev/da0s1e 558889580 444394628 69783786 86% 6122079 64052639 9% /raid
155.28 real 1.36 user 63.30 sys
/dev/da0s1e 558889580 445921756 68256658 87% 6143117 64031601 9% /raid
170.80 real 1.65 user 63.68 sys
/dev/da0s1e 558889580 447448884 66729530 87% 6164155 64010563 9% /raid
121.58 real 1.84 user 65.67 sys
/dev/da0s1e 558889580 448976012 65202402 87% 6185193 63989525 9% /raid
124.90 real 1.53 user 66.31 sys
/dev/da0s1e 558889580 450503140 63675274 88% 6206231 63968487 9% /raid
126.03 real 1.48 user 66.31 sys
/dev/da0s1e 558889580 452030268 62148146 88% 6227269 63947449 9% /raid
129.77 real 1.52 user 66.29 sys
/dev/da0s1e 558889580 453557398 60621016 88% 6248307 63926411 9% /raid
124.69 real 1.49 user 67.11 sys
/dev/da0s1e 558889580 455084526 59093888 89% 6269345 63905373 9% /raid
137.29 real 1.46 user 68.44 sys
/dev/da0s1e 558889580 456611654 57566760 89% 6290383 63884335 9% /raid
126.54 real 1.84 user 66.92 sys
/dev/da0s1e 558889580 458138782 56039632 89% 6311421 63863297 9% /raid
140.16 real 1.49 user 69.78 sys
/dev/da0s1e 558889580 459665910 54512504 89% 6332459 63842259 9% /raid
139.00 real 1.68 user 72.60 sys
/dev/da0s1e 558889580 461193038 52985376 90% 6353497 63821221 9% /raid
143.07 real 1.76 user 78.86 sys
/dev/da0s1e 558889580 462720168 51458246 90% 6374535 63800183 9% /raid
164.61 real 1.57 user 94.91 sys
/dev/da0s1e 558889580 464247296 49931118 90% 6395573 63779145 9% /raid
162.26 real 1.77 user 107.84 sys
/dev/da0s1e 558889580 465774424 48403990 91% 6416611 63758107 9% /raid
169.80 real 1.64 user 100.17 sys
/dev/da0s1e 558889580 467301552 46876862 91% 6437649 63737069 9% /raid
170.83 real 1.70 user 80.28 sys
/dev/da0s1e 558889580 468828680 45349734 91% 6458687 63716031 9% /raid
134.12 real 1.68 user 66.30 sys
/dev/da0s1e 558889580 470355808 43822606 91% 6479725 63694993 9% /raid
134.38 real 1.92 user 65.51 sys
/dev/da0s1e 558889580 471882938 42295476 92% 6500763 63673955 9% /raid
119.38 real 1.41 user 66.21 sys
/dev/da0s1e 558889580 473410066 40768348 92% 6521801 63652917 9% /raid
123.21 real 1.75 user 66.18 sys
/dev/da0s1e 558889580 474937194 39241220 92% 6542839 63631879 9% /raid
125.09 real 1.75 user 66.63 sys
/dev/da0s1e 558889580 476464322 37714092 93% 6563877 63610841 9% /raid
129.32 real 1.65 user 67.14 sys
/dev/da0s1e 558889580 477991452 36186962 93% 6584916 63589802 9% /raid
129.29 real 1.54 user 68.44 sys
/dev/da0s1e 558889580 479518580 34659834 93% 6605954 63568764 9% /raid
147.42 real 1.50 user 81.64 sys
/dev/da0s1e 558889580 481045710 33132704 94% 6626992 63547726 9% /raid
149.96 real 1.49 user 74.04 sys
/dev/da0s1e 558889580 482572838 31605576 94% 6648030 63526688 9% /raid
175.23 real 1.97 user 101.63 sys
/dev/da0s1e 558889580 484099966 30078448 94% 6669068 63505650 10% /raid
182.27 real 1.79 user 115.61 sys
/dev/da0s1e 558889580 485627094 28551320 94% 6690106 63484612 10% /raid
134.85 real 1.44 user 77.27 sys
/dev/da0s1e 558889580 487154222 27024192 95% 6711144 63463574 10% /raid
208.96 real 1.63 user 91.81 sys
/dev/da0s1e 558889580 488681350 25497064 95% 6732182 63442536 10% /raid
148.43 real 1.74 user 111.99 sys
/dev/da0s1e 558889580 490208480 23969934 95% 6753220 63421498 10% /raid
151.99 real 1.51 user 115.23 sys
/dev/da0s1e 558889580 491735608 22442806 96% 6774258 63400460 10% /raid
146.03 real 1.76 user 109.72 sys
/dev/da0s1e 558889580 493262736 20915678 96% 6795296 63379422 10% /raid
171.04 real 1.67 user 132.48 sys
/dev/da0s1e 558889580 494789864 19388550 96% 6816334 63358384 10% /raid
144.08 real 1.61 user 107.65 sys
/dev/da0s1e 558889580 496316992 17861422 97% 6837372 63337346 10% /raid
149.30 real 1.75 user 112.30 sys
/dev/da0s1e 558889580 497844120 16334294 97% 6858410 63316308 10% /raid
147.50 real 1.71 user 114.39 sys
/dev/da0s1e 558889580 499371248 14807166 97% 6879448 63295270 10% /raid
153.79 real 1.65 user 118.08 sys
/dev/da0s1e 558889580 500898378 13280036 97% 6900486 63274232 10% /raid
144.62 real 1.71 user 109.23 sys
/dev/da0s1e 558889580 502425506 11752908 98% 6921524 63253194 10% /raid
134.38 real 1.33 user 98.33 sys
/dev/da0s1e 558889580 503952634 10225780 98% 6942562 63232156 10% /raid
106.89 real 1.44 user 71.75 sys
/dev/da0s1e 558889580 505479762 8698652 98% 6963600 63211118 10% /raid
138.70 real 1.50 user 103.18 sys
/dev/da0s1e 558889580 507006890 7171524 99% 6984638 63190080 10% /raid
106.71 real 1.31 user 67.53 sys
/dev/da0s1e 558889580 508534020 5644394 99% 7005676 63169042 10% /raid
112.68 real 1.41 user 72.57 sys
/dev/da0s1e 558889580 510061148 4117266 99% 7026714 63148004 10% /raid
146.53 real 1.41 user 101.87 sys
/dev/da0s1e 558889580 511588276 2590138 99% 7047752 63126966 10% /raid
134.13 real 1.61 user 95.51 sys
/dev/da0s1e 558889580 513115404 1063010 100% 7068790 63105928 10% /raid
total: 38098.69 real 492.64 user 22852.88 sys
# tunefs: average file size: (-f) 16384
# tunefs: average number of files in a directory: (-s) 64
---------------------------------------------------------------------
/dev/da0s1e 558889580 456611290 57567124 89% 6290373 63884345 9% /raid
110.60 real 1.39 user 64.17 sys
/dev/da0s1e 558889580 458138418 56039996 89% 6311411 63863307 9% /raid
112.72 real 1.45 user 64.43 sys
/dev/da0s1e 558889580 459665546 54512868 89% 6332449 63842269 9% /raid
111.78 real 1.29 user 65.39 sys
/dev/da0s1e 558889580 461192674 52985740 90% 6353487 63821231 9% /raid
114.09 real 1.38 user 65.11 sys
/dev/da0s1e 558889580 462719802 51458612 90% 6374525 63800193 9% /raid
112.38 real 1.39 user 65.89 sys
/dev/da0s1e 558889580 464246930 49931484 90% 6395563 63779155 9% /raid
108.89 real 1.35 user 65.48 sys
/dev/da0s1e 558889580 465774058 48404356 91% 6416601 63758117 9% /raid
107.14 real 1.31 user 66.02 sys
/dev/da0s1e 558889580 467301186 46877228 91% 6437639 63737079 9% /raid
110.17 real 1.35 user 66.06 sys
/dev/da0s1e 558889580 468828314 45350100 91% 6458677 63716041 9% /raid
103.04 real 1.35 user 66.41 sys
/dev/da0s1e 558889580 470355442 43822972 91% 6479715 63695003 9% /raid
103.65 real 1.33 user 67.43 sys
/dev/da0s1e 558889580 471882570 42295844 92% 6500753 63673965 9% /raid
111.54 real 1.39 user 67.63 sys
/dev/da0s1e 558889580 473409698 40768716 92% 6521791 63652927 9% /raid
119.68 real 1.49 user 70.97 sys
/dev/da0s1e 558889580 474936826 39241588 92% 6542829 63631889 9% /raid
117.99 real 1.43 user 72.54 sys
/dev/da0s1e 558889580 476463954 37714460 93% 6563867 63610851 9% /raid
124.48 real 1.33 user 82.17 sys
/dev/da0s1e 558889580 477991084 36187330 93% 6584906 63589812 9% /raid
128.24 real 1.53 user 79.34 sys
/dev/da0s1e 558889580 479518212 34660202 93% 6605944 63568774 9% /raid
133.42 real 1.73 user 94.41 sys
/dev/da0s1e 558889580 481045340 33133074 94% 6626982 63547736 9% /raid
145.93 real 1.65 user 101.37 sys
/dev/da0s1e 558889580 482572468 31605946 94% 6648020 63526698 9% /raid
135.85 real 1.60 user 84.61 sys
/dev/da0s1e 558889580 484099596 30078818 94% 6669058 63505660 10% /raid
115.91 real 1.58 user 75.65 sys
/dev/da0s1e 558889580 485626724 28551690 94% 6690096 63484622 10% /raid
160.66 real 1.64 user 118.11 sys
/dev/da0s1e 558889580 487153852 27024562 95% 6711134 63463584 10% /raid
157.99 real 1.55 user 117.62 sys
/dev/da0s1e 558889580 488680980 25497434 95% 6732172 63442546 10% /raid
171.06 real 1.63 user 127.95 sys
/dev/da0s1e 558889580 490208108 23970306 95% 6753210 63421508 10% /raid
178.18 real 1.80 user 131.84 sys
/dev/da0s1e 558889580 491735236 22443178 96% 6774248 63400470 10% /raid
167.70 real 1.64 user 113.13 sys
/dev/da0s1e 558889580 493262364 20916050 96% 6795286 63379432 10% /raid
214.04 real 1.70 user 154.12 sys
/dev/da0s1e 558889580 494789492 19388922 96% 6816324 63358394 10% /raid
191.70 real 1.74 user 135.61 sys
/dev/da0s1e 558889580 496316620 17861794 97% 6837362 63337356 10% /raid
199.92 real 1.90 user 157.57 sys
/dev/da0s1e 558889580 497843748 16334666 97% 6858400 63316318 10% /raid
281.91 real 2.00 user 192.30 sys
/dev/da0s1e 558889580 499370876 14807538 97% 6879438 63295280 10% /raid
198.64 real 1.57 user 144.68 sys
/dev/da0s1e 558889580 500898004 13280410 97% 6900476 63274242 10% /raid
271.23 real 1.97 user 213.29 sys
/dev/da0s1e 558889580 502425132 11753282 98% 6921514 63253204 10% /raid
224.62 real 1.75 user 160.66 sys
/dev/da0s1e 558889580 503952260 10226154 98% 6942552 63232166 10% /raid
233.05 real 1.65 user 168.42 sys
/dev/da0s1e 558889580 505479388 8699026 98% 6963590 63211128 10% /raid
217.44 real 2.09 user 160.40 sys
/dev/da0s1e 558889580 507006516 7171898 99% 6984628 63190090 10% /raid
284.76 real 1.97 user 219.23 sys
/dev/da0s1e 558889580 508533644 5644770 99% 7005666 63169052 10% /raid
234.51 real 1.67 user 170.36 sys
/dev/da0s1e 558889580 510060772 4117642 99% 7026704 63148014 10% /raid
236.64 real 2.00 user 173.65 sys
/dev/da0s1e 558889580 511587900 2590514 99% 7047742 63126976 10% /raid
518.78 real 1.86 user 411.95 sys
/dev/da0s1e 558889580 513115028 1063386 100% 7068780 63105938 10% /raid
total: 39100.60 real 483.88 user 23886.66 sys
--
Ken Marx, kmarx at vicor-nb.com
Agree. Agree. But still we have to right size and step upto the plate on the
milestones.
- http://www.bigshed.com/cgi-bin/speak.cgi
More information about the freebsd-fs mailing list