apps/site/pages/en/learn/diagnostics/memory/understanding-and-tuning-memory.md
authors: avivkeller

# Understanding and Tuning Memory

Node.js, built on Google's V8 JavaScript engine, offers a powerful runtime for running JavaScript on the server side. However, as your applications grow, managing memory becomes a critical task for maintaining optimal performance and managing problems like memory leaks or crashes. In this article, we'll explore how to monitor, manage, and optimize memory usage within Node.js. We'll also cover important V8 concepts like the heap and garbage collection and discuss how to use command-line flags to fine-tune memory behavior.

## How V8 Manages Memory
At its core, V8 divides memory into several parts, with two primary areas being the **heap** and the **stack**. Understanding these spaces, especially how the heap is managed, is key to improving memory usage in your app.
### The Heap

V8's memory management is based on the generational hypothesis, the idea that most objects die young. Therefore, it separates the heap into generations to optimize garbage collection:

1. **New Space**: This is where new, short-lived objects are allocated. Objects here are expected to "die young", so garbage collection occurs frequently, allowing memory to be reclaimed quickly.

   For example, let's say you have an API that receives 1,000 requests per second. Each request generates a temporary object like `{ name: 'John', age: 30 }`, which is discarded once the request is processed. If you leave the New Space size at the default, V8 will frequently perform minor garbage collections to clear these small objects, ensuring that memory usage remains manageable.

2. **Old Space**: Objects that survive multiple garbage collection cycles in the New Space are promoted to the Old Space. These are usually long-lived objects, such as user sessions, cache data, or persistent state. Because these objects tend to last longer, garbage collection in this space occurs less often but is more resource-intensive.

   Let's say you are running an application that tracks user sessions. Each session might store data like `{ userId: 'abc123', timestamp: '2025-04-10T12:00:00', sessionData: {...} }`, which needs to persist in memory as long as the user is active. As the number of concurrent users grows, the Old Space could fill up, causing out-of-memory errors or slower response times due to inefficient garbage collection cycles.

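The two lifetimes described above can be sketched in code. The `handleRequest` and `openSession` functions below are hypothetical, purely to contrast objects that die young with objects that stay reachable long enough to be promoted:

```javascript
// A per-request object becomes unreachable as soon as the handler returns,
// so a frequent minor GC reclaims it from the New Space.
function handleRequest() {
  const tmp = { name: 'John', age: 30 }; // dies young
  return tmp.age;
}

// Objects kept reachable from a long-lived store survive minor GCs and are
// eventually promoted to the Old Space.
const sessions = new Map();
function openSession(userId) {
  sessions.set(userId, { userId, timestamp: Date.now(), sessionData: {} });
}

handleRequest(); // temporary object, collectable immediately
openSession('abc123'); // session object stays reachable via `sessions`
```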
In V8, memory for JavaScript objects, arrays, and functions is allocated in the **heap**. The size of the heap is not fixed, and exceeding the available memory can result in an "out-of-memory" error, causing your application to crash.
To check the current heap size limit, you can use the `v8` module.

### The Stack

Whenever a function is called, a new frame is pushed onto the stack. When the function returns, its frame is popped off and that memory is reclaimed.

## Monitoring Memory Usage

Before tuning memory usage, it's important to understand how much memory your application is consuming. Node.js and V8 provide several tools for monitoring memory usage.

### Using `process.memoryUsage()`

The `process.memoryUsage()` method provides insights into how much memory your Node.js process is consuming:

- **`rss`**: Resident Set Size, the total memory occupied by the process in main memory.
- **`heapTotal`**: The total size of the V8 heap allocated for the application.
- **`heapUsed`**: The portion of the heap actually in use by JavaScript objects.
- **`external`**: Memory used by external resources like bindings to C++ libraries.
- **`arrayBuffers`**: Memory allocated to various Buffer-like objects.

Here's how to use `process.memoryUsage()` to monitor memory usage in your application:

```javascript
console.log(process.memoryUsage());
```

The output will show how much memory is being used in each area:

```json
{
  "rss": 39481344,
  "heapTotal": 6537216,
  "heapUsed": 5201920,
  "external": 423797,
  "arrayBuffers": 11158
}
```

By monitoring these values over time, you can identify if memory usage is increasing unexpectedly. For instance, if `heapUsed` steadily grows without being released, it could indicate a memory leak in your application.

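One way to watch for that pattern is to sample `heapUsed` periodically. The interval and bookkeeping below are illustrative, not a prescription:

```javascript
// Record heapUsed samples; steady growth across samples suggests a leak.
const samples = [];

function sampleHeap() {
  samples.push(process.memoryUsage().heapUsed);
  if (samples.length > 1) {
    const growth = samples[samples.length - 1] - samples[0];
    console.log(`heapUsed growth since first sample: ${growth} bytes`);
  }
}

// Sample every 5 seconds; remember to clearInterval(timer) on shutdown.
const timer = setInterval(sampleHeap, 5000);
```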
## Command-Line Flags for Memory Tuning

Node.js offers several command-line flags to fine-tune memory-related settings, allowing you to optimize memory usage in your application.

### `--max-old-space-size`
This flag sets a limit on the size of the **Old Space** in the V8 heap, where long-lived objects are stored. If your application uses a significant amount of memory, you might need to adjust this limit.

For example, let's say your application handles a steady stream of incoming requests, each of which generates a large object. Over time, if these objects are not cleared, the Old Space could become overloaded, causing crashes or slower response times.

You can increase the Old Space size by setting the `--max-old-space-size` flag:

```bash
node --max-old-space-size=4096 app.js
```

This sets the Old Space size to 4096 MB (4 GB), which is particularly useful if your application is handling a large amount of persistent data, like caching or user session information.

### `--max-semi-space-size`

This flag controls the size of the **New Space** in the V8 heap. New Space is where newly created objects are allocated and garbage collected frequently. Increasing this size can reduce the frequency of minor garbage collection cycles.

For example, if you have an API that receives a large number of requests, each creating small objects like `{ name: 'Alice', action: 'login' }`, you may notice performance degradation due to frequent garbage collection. By increasing the New Space size, you can reduce the frequency of these collections and improve overall performance.

```bash
node --max-semi-space-size=64 app.js
```

This increases the New Space to 64 MB, allowing for more objects to reside in memory before triggering garbage collection. This is particularly useful in high-throughput environments where object creation and destruction are frequent.

### `--gc-interval`

This flag adjusts how frequently garbage collection cycles occur. By default, V8 determines the best interval, but you can override this setting in some scenarios where you need more control over memory cleanup.

For example, in a real-time application like a stock trading platform, you may want to minimize the impact of garbage collection by reducing the frequency of collections, ensuring the application can process data without significant pauses.

```bash
node --gc-interval=100 app.js
```

This setting instructs V8 to attempt a garbage collection pass after every 100 allocations. You may need to adjust this interval for specific use cases, but be cautious: setting the interval too low can cause performance degradation due to excessive garbage collection cycles.

### `--expose-gc`

With the `--expose-gc` flag, you can manually trigger garbage collection from within your application code. This can be helpful in specific scenarios, like after processing a large batch of data, where you want to reclaim memory before continuing with further operations.

To expose `gc`, start your app with:

```bash
node --expose-gc app.js
```

Then, within your application code, you can call `global.gc()` to manually trigger garbage collection:
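A minimal sketch follows; the batch function is hypothetical, and the `global.gc` guard keeps the code safe to run when the flag is absent:

```javascript
function processLargeBatch() {
  let batch = new Array(1_000_000).fill({ processed: false });
  // ...process the batch...
  batch = null; // drop the reference so the memory becomes collectable

  // global.gc is only defined when Node.js was started with --expose-gc.
  if (global.gc) {
    global.gc();
  }
}

processLargeBatch();
```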