I am having a similar issue to this person. The primary difference is that the application is NOT meant for a developer environment, so I need to know how to optimize the memory SQL Server uses (possibly tuned per machine, based on its specs).
I was intrigued by Ricardo C's answer, particularly the following:
Extracted from the SQL Server documentation:
Maximum server memory (in MB)
Specifies the maximum amount of memory SQL Server can allocate when it starts and while it runs. This configuration option can be set to a specific value if you know there are multiple applications running at the same time as SQL Server and you want to guarantee that these applications have sufficient memory to run. If these other applications, such as Web or e-mail servers, request memory only as needed, then do not set the option, because SQL Server will release memory to them as needed. However, applications often use whatever memory is available when they start and do not request more if needed. If an application that behaves in this manner runs on the same computer at the same time as SQL Server, set the option to a value that guarantees that the memory required by the application is not allocated by SQL Server.
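In case it matters, here is roughly how I was planning to apply that setting from our installer. This is only a sketch: the connection string, the 4096 MB cap, and the assumption that the account has permission to run sp_configure (sysadmin/ALTER SETTINGS) are all placeholders; the real value would be derived from the machine's specs.

```csharp
using System.Data.SqlClient;

class SqlMemoryConfigurator
{
    // Caps the memory SQL Server will allocate, using the documented
    // 'max server memory (MB)' option. maxMemoryMb is illustrative only.
    public static void SetMaxServerMemory(string connectionString, int maxMemoryMb)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();

            // 'max server memory (MB)' is an advanced option, so expose it first.
            Execute(connection, "EXEC sp_configure 'show advanced options', 1; RECONFIGURE;");

            // Apply the cap and make it take effect.
            Execute(connection, string.Format(
                "EXEC sp_configure 'max server memory (MB)', {0}; RECONFIGURE;", maxMemoryMb));
        }
    }

    private static void Execute(SqlConnection connection, string sql)
    {
        using (var command = new SqlCommand(sql, connection))
        {
            command.ExecuteNonQuery();
        }
    }
}
```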
My question is: how does an application request memory from the OS when it needs it? Is this something built in at compile time, or something the developer has to manage explicitly? The two primary applications running on this machine are SQL Server and the (fairly heavyweight) C# application I'm developing, and I'm almost certain we didn't do anything specific about asking the OS for memory. Is there a correct/necessary way to do this?
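For what it's worth, this is what I mean by "not doing anything specific": all of our allocations are ordinary managed allocations like the sketch below, and as far as I understand it the CLR grows the managed heap by asking the OS for more memory behind the scenes as objects are created.

```csharp
using System.Collections.Generic;

class AllocationExample
{
    static void Main()
    {
        // Nothing here asks the OS for memory directly; the runtime requests
        // more from the OS only as the managed heap needs to grow.
        var cache = new Dictionary<int, byte[]>();
        for (int i = 0; i < 100; i++)
        {
            cache[i] = new byte[1024 * 1024]; // roughly 1 MB per entry
        }
    }
}
```

If that on-demand growth is what the documentation means by an application that "requests memory only as needed", then perhaps leaving the option unset is fine; if not, I'd like to know what the explicit approach looks like.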