Thanks for the information again. It will be a month before I can get the new drive. I am definitely going to research this.
Actually you are wrong.
They may be 32-bit, but there can be multiple processes, each of which can use up to a maximum of 4 GB.
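You can check the pointer width, and hence the per-process address-space ceiling, of whatever interpreter you're running; this is a generic Python sketch, not anything page-file specific:

```python
import struct

# Each process has its own private virtual address space. A 32-bit
# process tops out at 2**32 bytes (4 GiB) regardless of installed
# RAM, so several 32-bit processes can together use far more.
bits = struct.calcsize("P") * 8        # pointer width of this interpreter
ceiling_gib = 2 ** bits // 2 ** 30     # addressable virtual memory, GiB
print(f"{bits}-bit process: {ceiling_gib} GiB of address space")
```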
The page file is necessary as a fail-safe for normal operation. You may have been lucky so far, sure, but don't give others that bad advice.
The page file adjusts its size based on what you use over time: Windows builds a profile of your most-used applications and sizes it accordingly.
If you have multiple apps running, the page file will change its size again, but nothing is actually written; the space is pre-allocated until it is used.
There are still apps out there that request page file usage. Turning it off doesn't increase your PC's performance; that is a myth. Having it on doesn't decrease performance either.
But as I said, it is needed, and I suggest not believing the multitude of so-called experts on overclocking forums and the like.
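You can see what Windows has actually allocated and used for the page file on your own machine; one way (assuming Windows 8 or later, run from PowerShell) is the WMI `Win32_PageFileUsage` class:

```shell
# Show configured page file size and current/peak usage, in MB
Get-CimInstance Win32_PageFileUsage |
    Select-Object Name, AllocatedBaseSize, CurrentUsage, PeakUsage
```

Comparing `CurrentUsage` against `AllocatedBaseSize` shows the pre-allocated-but-unused gap described above.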
Also I never said to keep the page file on an SSD either.
I would also like to see a legitimate source explaining why disabling the page file is fine.
Also, 8 GB is fine these days, sure, but more and more titles are 64-bit, and many future titles will be too.
Witcher 3 and Batman AK are both 64-bit only. So what about people with 8 GB of RAM and a decent GPU: should they disable their page file while playing Witcher 3 or Batman AK in the near future?
It's crash central if they do.
Even if you open multiple browsers with multiple tabs each, you'll find it very hard to get to 4 GB. Try that and get back to me.
Also, I never said disabling the page file will increase performance, but from my experience it absolutely does: certain games load and open faster. Windows uses the page file at its own discretion for each and every app, and disabling it forces Windows to use RAM, which is much faster.
Like I said, if you have 8 GB I wouldn't recommend disabling it, but if you have 16 GB, which is almost standard these days, then you're perfectly fine.
You want a legitimate source? I'm your source.
Let's say I'm not exactly the average PC user. I've run multiple virtual machines while browsing, played every game you can imagine, and I edit huge video files, and I've never had a single hiccup or crash. You can take that to the bank.
Edit: By the way, if you want another source, Neo7 is your man. I remember we discussed this here a few years ago, and he was the one who recommended I disable the page file because he'd had good experience with it, and so did I. It turned out to be the right choice.
But you can take it or leave it; I'm not arguing with you. For me it worked brilliantly.
The main thing for my friend is that the file won't sit on his SSD; whether he decides to disable it or not is a minor detail.
[Edited by kingkob, 6/15/2015 5:00:12 PM]
Back then, 4 GB was the most any system would carry and was considered the pinnacle of huge memory. The minimum RAM needed to run XP was 64 MB, and the most common configuration was 512 MB.
The purpose of the page file was twofold:
1. Store processes that have been idle for a long time.
2. Emergency memory management when needed (you'd see a warning balloon and a threat of processes being terminated by force).
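Both roles fall out of what Windows calls the commit limit: the OS will only promise ("commit") as much memory as RAM plus page file combined. As a toy model (my own illustrative numbers, not measurements):

```python
def commit_limit_gb(ram_gb, pagefile_gb):
    """Toy model: the OS can promise ('commit') at most RAM plus
    page file to all processes combined."""
    return ram_gb + pagefile_gb

# An 8 GB machine with a 12 GB page file can absorb big spikes:
print(commit_limit_gb(8, 12))   # 20
# Disable the page file and the safety margin is RAM alone:
print(commit_limit_gb(8, 0))    # 8
```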
Back then, SSDs and flash-based storage weren't a thing yet for ordinary users. It was the era of WD Black 7200 RPM or GTFO. Even then the page file still slowed things down, since RAM is much faster than a mechanical hard drive. This is where the notion of disabling the page file came in.
If a user had enough RAM, they could in theory turn the page file off and rely purely on RAM alone. The speed boost came from never having to wait on the mechanical hard drive doing its thing whenever the page file would otherwise be needed. This came with some caveats:
1. If your computer BSODs for any reason, you risked losing troubleshooting information, since RAM is volatile memory (and Windows writes crash dumps through the page file).
2. If you reached your physical RAM limit, your computer froze. No error, no logging, no recovery. Power users hit this a few times, but hardly anyone ever mentioned it.
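In practice the failure often shows up as a refused allocation rather than only a silent freeze. A minimal way to see that failure path from Python (the size here is deliberately absurd so it fails on any machine):

```python
# Hypothetical illustration: when the commit limit is reached
# (RAM plus page file, or RAM alone with the page file disabled),
# new allocations are refused. Asking for ~4 EiB forces the same
# refusal on any real machine.
try:
    buf = bytearray(1 << 62)
except (MemoryError, OverflowError):
    print("allocation refused: commit limit reached")
```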
Before we jump to the Vista era and discuss SuperFetch and its capabilities (as mentioned by DAB), we need to discuss the Prefetcher first. The Prefetcher is an XP technology and the precursor to SuperFetch. It takes critical files in XP and preloads them into RAM, which speeds up base OS functionality and boot times. But we come back to the page file: if RAM is already occupied and new programs get loaded, the OS has a choice:
1. Evict the prefetched data to the page file, or
2. Rely on the page file to pick up the slack.
Either way you'll see lag due to the limits of the hardware. This is part of why disabling the page file made things faster back then.
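That eviction choice can be sketched as a toy least-recently-used model (purely illustrative Python; real Windows memory management is far more involved):

```python
from collections import OrderedDict

class TinyMemory:
    """Toy model: RAM holds a few 'pages'; overflow spills into a
    page file dict. Not how Windows actually implements it."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.ram = OrderedDict()
        self.pagefile = {}

    def touch(self, page):
        if page in self.pagefile:              # page-in: slow on a HDD
            self.ram[page] = self.pagefile.pop(page)
        self.ram[page] = self.ram.pop(page, 0) + 1  # now most recently used
        while len(self.ram) > self.capacity:   # RAM full: evict LRU page
            victim, count = self.ram.popitem(last=False)
            self.pagefile[victim] = count

mem = TinyMemory(capacity=2)
for p in ["os", "game", "browser"]:
    mem.touch(p)
print(sorted(mem.pagefile))   # ['os'] was least recently used, so it got paged out
```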
Now let's fast-forward to the Vista era and briefly touch on SuperFetch. This is a newer service that is still in Windows to this day; it does what the Prefetcher does but also intelligently monitors the user's favorite processes and preloads far more selectively, which greatly improves efficiency. Again, it's not something everyone embraced; some turned it off in favor of small performance boosts (again, the not-enough-RAM, rely-on-the-page-file scenario). By default Windows manages both SuperFetch and the page file automatically, but I believe they can be configured with hard-set bounds if you know what you're doing (not recommended unless you have a very specific reason for doing so).
So in this day and age, is it recommended? I don't recommend it anymore, due to the evolution of the OS and of programs. My primary system uses 24 GB of RAM (it used to be 32, but one stick crapped out on me). Even though I have far more memory than I'll ever need, I see no reason to turn the page file off, for two reasons:
1. My drive is now an SSD, so the practical speed gain of staying in RAM over paging is negligible. Even without an SSD, the slowdown caused by the page file only happens when it's needed for emergency memory management.
2. Remember the prefetching I mentioned before (and that DAB touched upon)? It is effectively an index and behaves like one: fetching data through an index is always more efficient than fetching from the source. My Task Manager tells me I have 5.5 GB of cached data in my page file (which covers a lot of programs).
Ultimately, the old disable-the-page-file technique just masked the true underlying problem: if you saw large performance gains, the real issue was that you did not have enough RAM (and the actual solution was to buy more or better RAM). There was also the x64 edition of Windows XP for enthusiasts to play around with.
I have brought workstations with 16 and even 32 GB of RAM to their knees with relative ease through web application development (the upkeep of a dev sandbox is pretty high). Most hypervisors are smart about dynamically allocating only the RAM that is needed at the time. It's once you actually start using all the resources that you start seeing problems (and the most popular weapon for eating all the resources is the JVM, brought to you by Java).
Just one more note: Chrome has a reputation for using way more resources than necessary (which seems to be becoming a theme with things Google creates; see Android 5.0.1). It's pretty easy to push that browser beyond 3-4 GB. On my machine, Chrome takes about 2 GB before I start seeing slowdown. Microsoft Edge (currently my daily driver) is a 64-bit app and had (still has?) a really nasty memory leak: it took at least 5-6 GB before I noticed slowdown and 10+ GB before the browser was nearly unusable.
That is all I have on the subject. A lot of people regurgitate advice based on personal experience, but that is not enough in this field; you need to understand what the technology does and why things are considered good or bad in order to wield it properly. That is a major fault I see in a lot of power users. I've had some really crap computers to get by on in the past (as in, I needed Caliber's trainers to get past some parts because the framerates were so low that it was impossible to do stuff without cheating). That's probably why I had a positive experience with disabling it (although not by much, if memory serves).
[Edited by Neo7, 6/15/2015 6:09:57 PM]
1. Virtually any browser you use while under Linux
2. Chromium / Chrome (August 2014)
3. IE (since the release of Windows x64 but made default by Windows 8/8.1).
4. Safari (Mac Only)
5. Opera (one of the first to pioneer 64-bit; now naturally 64-bit since it's Chromium-based)
6. Microsoft Edge (this is actually a WinRT-based browser, and those are 64-bit)
That really leaves Firefox lagging behind among the main browsers.
If you're an average or midrange power user, you should be fine, and even better off, with the page file disabled.
I had the page file disabled on two high-end machines, and they both ran flawlessly.
[Edited by kingkob, 6/15/2015 6:31:51 PM]
It's ok to change your mind: Link
Lol, even Dab spoke differently; sometimes I wonder if people post just for the sake of arguing.
I recommend disabling it.
[Edited by kingkob, 6/15/2015 6:37:30 PM]