SQL Server has an upper memory limit in the Server Properties: "max server memory". This is the maximum amount of memory the SQL Server database engine can use. Just the database engine; it does not include SSRS, SSIS or SSAS. Over time the database engine will grow to use the whole allocation and hold on to it. If we set the upper limit to 90 GB, the SQL Server DB engine will end up using all 90 GB, all the time. Not some of the time, but all of the time: 90 GB every minute of the day.
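To see where this limit currently sits, you can query it with sp_configure (the value shown is in MB):

```sql
-- 'max server memory (MB)' is an advanced option, so expose it first
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- Display the current setting; run_value is in MB
-- (the default of 2147483647 effectively means "unlimited")
EXEC sp_configure 'max server memory (MB)';
```

This is the same setting you see on the Memory page of Server Properties in Management Studio.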
Many people put SSIS in the same box as the SQL Server databases. This is fine, provided that SSIS is given enough memory. If the box has 96 GB, and the maximum amount made available to SQL Server is 90 GB, then only 6 GB is left for everything else, including Windows processes, antivirus, SSIS, SSRS and SSAS. In this case SSIS would run very slowly, perhaps 100 times slower than it should, because effectively only a few MB would be available to SSIS (depending on what the other applications/processes are doing).
If the SQL Server box has 96 GB and the max amount is already set to 90 GB, and we put SSIS on the box, that 90 GB should be lowered to around 70 GB, so that 26 GB is available for SSIS, Windows and other processes. We then look at the amount of memory used by the SSIS engine (in Task Manager, or using Profiler). Generally speaking, SSIS would be happy with 16 GB available to it. Very big packages could require 32 GB, for example packages which move billions of rows.
Take 5 GB or so for Windows processes, antivirus and other operating system components. Assuming there is no SSRS or SSAS on the box, out of that 26 GB this leaves 21 GB available for SSIS to use. If there is SSRS on the box, take 5 to 10 GB for SSRS. If there is SSAS on the box, it is a different story, as SSAS tends to grab all remaining available memory. For this reason we should be putting SSAS in its own box/VM.
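Lowering the cap from 90 GB to 70 GB, as described above, is a one-off change with sp_configure (the value is in MB, and it takes effect immediately without a restart):

```sql
-- Lower the database engine's cap to 70 GB, leaving ~26 GB
-- for Windows, SSIS and other processes (70 * 1024 = 71680 MB).
-- Assumes 'show advanced options' has already been enabled.
EXEC sp_configure 'max server memory (MB)', 71680;
RECONFIGURE;
```

Note that the engine releases memory gradually; it will not drop to 70 GB the instant you run this, but it will stop growing beyond the new cap.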
With 21 GB, SSIS should be able to perform at near-peak performance. If the package processes up to 1 million rows, there will be no problem. If 5 packages run simultaneously, each processing up to 1m rows, there should be no problem either. Unless a package is written in a very inefficient way, of course. This is because SSIS automatically batches the rows into chunks of 10,000 (the default value of the data flow's DefaultBufferMaxRows property). So “normal size” packages should run fine in SSIS, at near-peak performance, with 16-21 GB of memory (based on my experience).
What if you only have 64 GB on the box? Then allocate 43 GB to SQL Server (which should be enough for most queries and SQL operations, for most database sizes), leaving 21 GB free for all other processes. Take 5 GB for Windows, and that leaves 16 GB for SSIS (assuming there is no SSRS on the box).
What if you only have 32 GB on the box and you want SQL Server and SSIS running in parallel? Give 16 GB to SQL Server, leaving 16 GB for everything else. Minus 5 GB for Windows and other O/S-related processes, this gives 11 GB to SSIS. Not ideal, but that’s the best we can set for both SSIS and SQL Server without knowing the details of the SSIS packages. Once we know the detailed workflow and tasks, we can be more precise.
I would be worried if there were less than 4 GB for SSIS (after taking out 5 GB for Windows processes, antivirus and other applications, and after taking out 5 GB for SSRS). 8 GB is the bare minimum I would recommend for SSIS, for most packages. But my default recommendation for most packages would be 16 GB (that’s free memory just for SSIS, after taking out Windows and SSRS).
You can of course measure it on your dev box using Profiler, checking the peak memory requirements when the packages are run in SQL Server. But this takes time (1-2 weeks) to properly measure the collective requirements of all the packages running in parallel.
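While measuring, you can also sanity-check the database engine’s side of the equation with a standard DMV query (sys.dm_os_process_memory, available from SQL Server 2008 onwards):

```sql
-- How much physical memory the SQL Server process is actually using,
-- to compare against the 'max server memory' cap
SELECT physical_memory_in_use_kb / 1024 AS memory_in_use_mb
FROM sys.dm_os_process_memory;
```

For the SSIS side, Performance Monitor exposes counters under the SSIS Pipeline object (such as Buffer memory and Buffers in use) that show how much memory the data flows are actually consuming while the packages run.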