Commit e1e3316

add specific examples, fix quotes, optimize code
1 parent b6b19fd commit e1e3316

1 file changed

Lines changed: 28 additions & 22 deletions

apps/site/pages/en/learn/diagnostics/memory/understanding-and-tuning-memory.md

@@ -6,28 +6,32 @@ authors: avivkeller
# Understanding and Tuning Memory

Node.js, built on Google's V8 JavaScript engine, offers a powerful runtime for running JavaScript on the server side. However, as your applications grow, managing memory becomes a critical task for maintaining optimal performance and avoiding problems like memory leaks or crashes. In this article, we'll explore how to monitor, manage, and optimize memory usage within Node.js. We'll also cover important V8 concepts like the heap and garbage collection and discuss how to use command-line flags to fine-tune memory behavior.
## How V8 Manages Memory

At its core, V8 divides memory into several parts, with two primary areas being the **heap** and the **stack**. Understanding these spaces, especially how the heap is managed, is key to improving memory usage in your app.

### The Heap

V8's memory management is based on the generational hypothesis, the idea that most objects die young. Therefore, it separates the heap into generations to optimize garbage collection:
1. **New Space**: This is where new, short-lived objects are allocated. Objects here are expected to "die young", so garbage collection occurs frequently, allowing memory to be reclaimed quickly.

   For example, let's say you have an API that receives 1,000 requests per second. Each request generates a temporary object like `{ name: 'John', age: 30 }`, which is discarded once the request is processed. If you leave the New Space size at the default, V8 will frequently perform minor garbage collections to clear these small objects, ensuring that memory usage remains manageable.

2. **Old Space**: Objects that survive multiple garbage collection cycles in the New Space are promoted to the Old Space. These are usually long-lived objects, such as user sessions, cache data, or persistent state. Because these objects tend to last longer, garbage collection in this space occurs less often but is more resource-intensive.

   Let's say you are running an application that tracks user sessions. Each session might store data like `{ userId: 'abc123', timestamp: '2025-04-10T12:00:00', sessionData: {...} }`, which needs to persist in memory as long as the user is active. As the number of concurrent users grows, the Old Space could fill up, causing out-of-memory errors or slower response times due to inefficient garbage collection cycles.

In V8, memory for JavaScript objects, arrays, and functions is allocated in the **heap**. The size of the heap is not fixed, and exceeding the available memory can result in an "out-of-memory" error, causing your application to crash.

To check the current heap size limit, you can use the `v8` module.

```cjs
const v8 = require('node:v8');
const { heap_size_limit } = v8.getHeapStatistics();
const heapSizeInGB = heap_size_limit / (1024 * 1024 * 1024);

console.log(`${heapSizeInGB} GB`);
```
@@ -42,7 +46,7 @@ Whenever a function is called, a new frame is pushed onto the stack. When the fu
## Monitoring Memory Usage

Before tuning memory usage, it's important to understand how much memory your application is consuming. Node.js and V8 provide several tools for monitoring memory usage.
### Using `process.memoryUsage()`
@@ -54,13 +58,13 @@ The `process.memoryUsage()` method provides insights into how much memory your N
- **`external`**: Memory used by external resources like bindings to C++ libraries.
- **`arrayBuffers`**: Memory allocated to various Buffer-like objects.

Here's how to use `process.memoryUsage()` to monitor memory usage in your application:

```javascript
console.log(process.memoryUsage());
```

The output will show how much memory is being used in each area:

```json
{
@@ -72,59 +76,61 @@ The output will look like:
}
```

By monitoring these values over time, you can identify if memory usage is increasing unexpectedly. For instance, if `heapUsed` steadily grows without being released, it could indicate a memory leak in your application.
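A simple way to watch for that pattern is to sample `process.memoryUsage()` on an interval and log the trend. A minimal sketch (the 10-second interval and MB formatting are arbitrary choices):

```javascript
// Sample heap usage periodically; a heapUsed value that climbs steadily
// and never drops back after garbage collection is a classic leak signature.
function logMemory() {
  const { rss, heapTotal, heapUsed } = process.memoryUsage();
  const toMB = bytes => (bytes / (1024 * 1024)).toFixed(2);
  console.log(
    `rss=${toMB(rss)} MB heapTotal=${toMB(heapTotal)} MB heapUsed=${toMB(heapUsed)} MB`
  );
}

const timer = setInterval(logMemory, 10_000);
timer.unref(); // don't keep the process alive just for monitoring
```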
## Command-Line Flags for Memory Tuning
Node.js offers several command-line flags to fine-tune memory-related settings, allowing you to optimize memory usage in your application.
### `--max-old-space-size`

This flag sets a limit on the size of the **Old Space** in the V8 heap, where long-lived objects are stored. If your application uses a significant amount of memory, you might need to adjust this limit.

For example, let's say your application handles a steady stream of incoming requests, each of which generates a large object. Over time, if these objects are not cleared, the Old Space could become overloaded, causing crashes or slower response times.

You can increase the Old Space size by setting the `--max-old-space-size` flag:

```bash
node --max-old-space-size=4096 app.js
```

This sets the Old Space size to 4096 MB (4 GB), which is particularly useful if your application is handling a large amount of persistent data, like caching or user session information. Adjust the value based on your system's available memory.

### `--max-semi-space-size`

This flag controls the size of the **New Space** in the V8 heap. New Space is where newly created objects are allocated and garbage collected frequently. Increasing this size can reduce the frequency of minor garbage collection cycles.

For example, if you have an API that receives a large number of requests, each creating small objects like `{ name: 'Alice', action: 'login' }`, you may notice performance degradation due to frequent garbage collection. By increasing the New Space size, you can reduce the frequency of these collections and improve overall performance.

```bash
node --max-semi-space-size=64 app.js
```

This increases the New Space to 64 MB, allowing more objects to reside in memory before triggering garbage collection. This is particularly useful in high-throughput environments where object creation and destruction are frequent.
### `--gc-interval`

This flag adjusts how frequently garbage collection cycles occur. By default, V8 determines the best interval, but you can override this setting in scenarios where you need more control over memory cleanup.

For example, in a real-time application like a stock trading platform, you may want to control when garbage collection happens so the application can process data without unpredictable pauses.

```bash
node --gc-interval=100 app.js
```

This setting asks V8 to attempt garbage collection after every 100 allocations, rather than on its usual adaptive schedule. You may need to adjust this interval for specific use cases, but be cautious: setting the interval too low can cause performance degradation due to excessive garbage collection cycles.
### `--expose-gc`

With the `--expose-gc` flag, you can manually trigger garbage collection from within your application code. This can be helpful in specific scenarios, like after processing a large batch of data, where you want to reclaim memory before continuing with further operations.

To expose `gc`, start your app with:

```bash
node --expose-gc app.js
```

Then, within your application code, you can call `global.gc()` to manually trigger garbage collection:

```javascript
global.gc();
```
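Because `global.gc` only exists when the flag is set, it's worth guarding the call. The sketch below frees a large batch and measures the effect on `heapUsed` (the batch workload itself is purely illustrative):

```javascript
// Run with: node --expose-gc app.js
function processBatch() {
  // Illustrative workload: a large temporary array that becomes garbage
  // as soon as this function returns.
  const batch = new Array(1_000_000).fill({ processed: true });
  return batch.length;
}

processBatch();

if (typeof global.gc === 'function') {
  const before = process.memoryUsage().heapUsed;
  global.gc(); // synchronously trigger a full collection
  const after = process.memoryUsage().heapUsed;
  console.log(
    `heapUsed: ${(before / 1e6).toFixed(1)} MB -> ${(after / 1e6).toFixed(1)} MB`
  );
} else {
  console.log('Run with --expose-gc to enable manual garbage collection.');
}
```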
