Computing desk
< December 20 | << Nov | December | Jan >> | December 22 >

Welcome to the Wikipedia Computing Reference Desk Archives

The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.
I'm using what is basically a large hash table. I'm trying to get as much data as possible into a Windows computer with 128GB of RAM. I'm using Delphi. The number of rows (the hash values) is fixed, but the number of items for each hash varies from zero to over 65,000 (and is not known in advance).
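For concreteness, here is a minimal Delphi sketch of that layout; the type names, the Int64 element type, and the NumHashes constant are my assumptions, since the post doesn't say what the items are:

```pascal
program HashLayoutSketch;

const
  NumHashes = 1000000;  // placeholder for the fixed row count (assumed)

type
  TBucket = array of Int64;       // items for one hash value; element type assumed
  THashTable = array of TBucket;  // outer index = hash value

var
  Table: THashTable;

begin
  SetLength(Table, NumHashes);  // fixed outer size; each inner array starts nil/empty
end.
```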
My first approach was to use a list of lists. But when I added a new item to a list, it seemed to double the amount of memory allocated, which ended up taking too much memory.
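If the list-of-lists route gets revisited: a sketch of presetting Capacity on Delphi's generic TList so it doesn't have to grow on its own (the exact growth policy, often doubling, varies by RTL version, so treat that as an assumption; the size 100 is just an example estimate):

```pascal
program ListCapacitySketch;

uses
  System.Generics.Collections;

var
  Bucket: TList<Int64>;
begin
  Bucket := TList<Int64>.Create;
  try
    Bucket.Capacity := 100;          // preset to an estimate; skips repeated grow steps
    Bucket.Add(42);
    Bucket.Capacity := Bucket.Count; // once the final size is known, trim the slack
  finally
    Bucket.Free;
  end;
end.
```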
I switched to a dynamic array of dynamic arrays, with the idea that when I needed more room for a particular hash, I'd increase the corresponding array by a certain percentage (I tried 10%, 20%, and 25%). But initially it seemed to be using a lot more memory than expected. In the test I'm running now (it has been running for a few hours and will finish overnight), the memory use according to Task Manager got up to 97%, 98%, and then 99%. The program is constantly adding data as it runs, but after a while the memory use dropped to 83%, then to 70%, and now it is at 62% (even though it is still rapidly adding data).
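A sketch of that grow-by-percentage scheme as I read it, refining the TBucket type from the earlier sketch into a record (field names assumed). Note that SetLength on a dynamic array asks the memory manager for the stated size rather than over-allocating the way a list does, so the allocated capacity has to be tracked separately from the used count:

```pascal
type
  TBucket = record
    Items: array of Int64;  // allocated capacity = Length(Items); element type assumed
    Count: Integer;         // slots actually in use
  end;

var
  Table: array of TBucket;  // SetLength(Table, NumHashes) done once at startup

procedure AddItem(Hash: Integer; const Value: Int64);
begin
  with Table[Hash] do
  begin
    if Count = Length(Items) then
      // grow by ~10%, plus a few slots so small buckets still make progress
      SetLength(Items, Length(Items) + (Length(Items) div 10) + 4);
    Items[Count] := Value;
    Inc(Count);
  end;
end;
```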
So what I'm wondering is: when I increase the size of a dynamic array by a few percent, is it actually allocating double that (the way it does with lists), and then, as the program runs, is the system smart enough to realize that it isn't using much of this memory and release it? Bubba73 You talkin' to me? 04:04, 21 December 2018 (UTC)
Rent time on Google Compute Engine or one of its competitors?
Buy a used 256GB server for a couple of thousand bucks? https://www.amazon.com/gp/offer-listing/B075XR4G18/
-- Guy Macon ( talk) 00:09, 22 December 2018 (UTC)