lib: add logger API in node core #60468
Conversation
Here are my first performance results. I think the child logger area still needs improvement, but I'm happy because the other results look good.

node:logger vs pino:

➜ node git:(mert/create-logger-api/node-core) ✗ ./node benchmark/logger/vs-pino.js n=100000
logger/vs-pino.js scenario="simple" logger="node-logger" n=100000: 5,240,540.823813018
logger/vs-pino.js scenario="child" logger="node-logger" n=100000: 2,635,847.7027229806
logger/vs-pino.js scenario="disabled" logger="node-logger" n=100000: 159,436,487.67795104
logger/vs-pino.js scenario="fields" logger="node-logger" n=100000: 3,619,336.304205216
logger/vs-pino.js scenario="simple" logger="pino" n=100000: 3,398,489.9761368227
logger/vs-pino.js scenario="child" logger="pino" n=100000: 4,489,799.803418606
logger/vs-pino.js scenario="disabled" logger="pino" n=100000: 119,772,384.56038144
logger/vs-pino.js scenario="fields" logger="pino" n=100000: 1,257,930.8609750536
I just learned about this feature in Pino; I will try to add it to node:logger.
This will require support for serializers |
I wonder: should we name them like Pino does (built-in serializers), or go with something like standardSerializers?
Follow Pino, and we'll change it later if needed.
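For context, Pino's built-in serializers are plain functions that map a value to a JSON-safe object. Here is a minimal sketch of an `err`-style serializer; the names and shape are illustrative, not the final `node:logger` API:

```javascript
// Sketch of a Pino-style error serializer: flattens an Error into
// plain JSON-safe fields so JSON.stringify never loses the
// non-enumerable message/stack properties.
function errSerializer(err) {
  if (!(err instanceof Error)) return err; // pass non-errors through
  return {
    type: err.constructor.name,
    message: err.message,
    stack: err.stack,
    ...err, // copy enumerable own properties such as err.code
  };
}

const out = errSerializer(new TypeError('bad input'));
console.log(out.type, out.message); // TypeError bad input
```

Pino exposes these under `pino.stdSerializers`; the naming question above is whether node:logger should mirror that or use `standardSerializers`.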
I tried implementing serializers for the logger. Some benchmark results improved, but fields and simple are seeing a decline, which I will work to resolve: fields went from 3.62M to 2.16M ops/s (-40%). Previously the child logger was quite slow, around 70% behind; the new results bring that gap down to 18%, which is actually a significant improvement. I'm continuing to try new approaches.
I inspected Pino and learned some patterns, then applied them in this commit with new results: 1ededc7. Using these patterns, I removed the cost of serializing bindings on every log call for the child logger. simple: 6.06M vs 3.48M ops/s (+74% faster)
Hello @mcollina, do you have any comments or suggestions on these latest commits? I'm curious 🙏.
Codecov Report: ❌ Patch coverage is
Additional details and impacted files
@@ Coverage Diff @@
## main #60468 +/- ##
==========================================
- Coverage 89.68% 89.64% -0.05%
==========================================
Files 706 708 +2
Lines 218222 219823 +1601
Branches 41768 42092 +324
==========================================
+ Hits 195717 197062 +1345
- Misses 14423 14641 +218
- Partials 8082 8120 +38
Hello everyone, I pushed a small follow-up, moved the hasSubscribers check before input validation so disabled consumers skip parsing (the original intent from @atlowChemi / @Qard). Plus a few doc bits: @RafaelGSS this round also covers your remaining points (channel names + Stability 1.1 on child), mind taking another look when you get a chance? Thanks again for all the reviews. |
Co-authored-by: Chemi Atlow <chemi@atlow.co.il>
https://github.com/nodejs/node/actions/runs/24440675298/job/71405021240?pr=60468 I looked into this error; I suspect it's sometimes a flaky test. I'll keep investigating.
Hello again, I ran these two tests on my Linux machine and they passed:

mert@lima-default:/Users/mert/Desktop/enjoy/linux-node/node$ out/Release/node test/parallel/test-snapshot-reproducible.js
mert@lima-default:/Users/mert/Desktop/enjoy/linux-node/node$
```js
// Add consumer fields
const consumerFields = this.#fields;
for (const key of ObjectKeys(consumerFields)) {
  json += `,${JSONStringify(key)}:${JSONStringify(consumerFields[key])}`;
}
```
We need an `if` check to ensure `consumerFields[key]` is not undefined, otherwise this will produce invalid JSON.
And same question as the other thread: what do we do when `fields` has keys conflicting with other fields?
Fixed, skipping undefined in handle() too.
And same question as the other thread: what do we do when `fields` has keys conflicting with other fields?
By design, the last written value takes precedence (see the field override priority test).
The `level`, `time`, and `msg` fields are reserved.
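Those rules (skip `undefined` values, keep `level`/`time`/`msg` reserved, and with duplicate keys the last one appended wins on parse) can be sketched as a field-appending loop; the helper name here is illustrative, not the actual internal code:

```javascript
const RESERVED = new Set(['level', 'time', 'msg']);

// Append extra fields to a partially built JSON record. Undefined
// values are skipped because JSON.stringify(undefined) returns
// undefined, which would interpolate as the invalid token `undefined`.
function appendFields(json, fields) {
  for (const key of Object.keys(fields)) {
    const value = fields[key];
    if (value === undefined || RESERVED.has(key)) continue;
    json += `,${JSON.stringify(key)}:${JSON.stringify(value)}`;
  }
  return json;
}

let json = '{"level":30,"time":0,"msg":"hi"';
json = appendFields(json, { a: 1, b: undefined, msg: 'ignored' });
json += '}';
console.log(json); // {"level":30,"time":0,"msg":"hi","a":1}
```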
```js
const serialized = this.#serializeValue(bindings[key], key);
result += `,${JSONStringify(key)}:${JSONStringify(serialized)}`;
```
Same for this, check for undefined to avoid invalid JSON.
Skipping undefined in #serializeBindings too.
```js
/**
 * Attach this consumer to log channels.
 */
attach() {
  for (const level of LEVEL_NAMES) {
    if (this[level].enabled) {
      const handler = this.handle.bind(this);
      this.#handlers[level] = handler;
      channels[level].subscribe(handler);
    }
  }
}
```
Add a flag to ensure we can't attach twice.
Added an #attached flag; attach() is idempotent now.
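A minimal sketch of the idempotent `attach()` guarded by a private flag (the class shape is illustrative):

```javascript
// Illustrative consumer skeleton: the private #attached flag makes a
// second attach() call a no-op, so handlers can never be subscribed twice.
class Consumer {
  #attached = false;
  subscriptions = 0; // stands in for channels[level].subscribe(handler)

  attach() {
    if (this.#attached) return;
    this.#attached = true;
    this.subscriptions++;
  }
}

const c = new Consumer();
c.attach();
c.attach(); // no-op
console.log(c.subscriptions); // 1
```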
```
* `stream` {number|string|Object} Output destination. Can be a file
  descriptor (number), file path (string), or a writable stream object.
```
Wouldn't it be better to define an interface for the object and explicitly declare the allowed methods? E.g. JSONConsumer calls flush/flushSync, so I can't use it with createWriteStream.
It now requires `write`, `flush`, `flushSync`, and `end`; the docs list them.
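A sketch of validating that duck-typed interface up front, using the method list from this thread (the helper name is illustrative):

```javascript
// Methods the consumer calls on its destination; checking them at
// construction time fails fast instead of crashing mid-write.
const REQUIRED_METHODS = ['write', 'flush', 'flushSync', 'end'];

function validateDestination(stream) {
  for (const method of REQUIRED_METHODS) {
    if (typeof stream[method] !== 'function') {
      throw new TypeError(`stream.${method} must be a function`);
    }
  }
}

// A plain object implementing the full interface passes:
validateDestination({ write() {}, flush() {}, flushSync() {}, end() {} });

// fs.createWriteStream() has no flushSync, so it is rejected up front.
```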
```js
};

// Apply serializers to additional fields
if (fields) {
```
Missing validation of `fields` as an object when it is defined.
Hoisted validateObject(fields, 'fields') so it runs for all branches.
```js
let logFields;

if (isNativeError(msgOrObj)) {
  msg = msgOrObj.message;
```
If this is undefined, it will likely produce invalid JSON.
Added ?? '' fallback for subclasses that leave message undefined.
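An illustrative sketch of the failure mode the `?? ''` fallback guards against (the clobbered `message` here is simulated):

```javascript
const err = new Error('boom');
err.message = undefined; // a subclass or caller can leave this unset

// Without the fallback, interpolating JSON.stringify(undefined) yields
// the invalid token `undefined` inside the record:
const bad = `{"msg":${JSON.stringify(err.message)}}`; // '{"msg":undefined}'

// With the fallback the record stays valid JSON:
const msg = err.message ?? '';
const good = `{"msg":${JSON.stringify(msg)}}`;
console.log(good); // {"msg":""}
```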
```js
  logFields = fields ? this.#applySerializers(fields) : kEmptyObject;
} else {
  const { msg: extractedMsg, ...restFields } = msgOrObj;
  msg = extractedMsg;
```
Same for this one; it can produce invalid JSON.
Already guarded by validateString(msgOrObj.msg) above.
```js
const fields = record.fields;
for (const key in fields) {
  json += `,${JSONStringify(key)}:${JSONStringify(fields[key])}`;
}
```
Why not serialize the entire `fields` object and just remove the first/last bracket?
What happens when `fields` has keys that conflict with other fields?
Bulk stringify: I'll follow up with a bench.
Last-wins by design; `level`, `time`, and `msg` are reserved.
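The suggested bulk approach can be sketched as one `JSON.stringify` call with the outer braces sliced off (helper name illustrative; a bench would need to confirm it beats the per-key loop, and note that `JSON.stringify` already drops undefined-valued keys on its own):

```javascript
// Serialize all extra fields in one JSON.stringify call, then strip
// the surrounding braces so the body can be appended to an in-progress
// record string.
function appendFieldsBulk(json, fields) {
  const body = JSON.stringify(fields); // e.g. '{"a":1,"b":"x"}'
  if (body === '{}') return json;      // nothing to append
  return `${json},${body.slice(1, -1)}`;
}

let json = '{"level":30,"msg":"hi"';
json = appendFieldsBulk(json, { a: 1, b: undefined });
json += '}';
console.log(json); // {"level":30,"msg":"hi","a":1}
```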
Description
Adds an experimental `node:logger` module that provides a structured, high-performance logging API for Node.js. Available behind the `--experimental-logger` flag.

Refs: #49296
Architecture
- Uses `diagnostics_channel` as the dispatch mechanism: loggers publish log records to level-specific channels, and consumers subscribe to receive them
- Disabled levels are `noop` functions, eliminating runtime level checks
- Uses `Utf8Stream` (a SonicBoom port) for high-throughput writes

API Surface
- `Logger`: producer class with `trace`, `debug`, `info`, `warn`, `error`, `fatal` methods
- `LogConsumer`: base consumer class for custom log handling
- `JSONConsumer`: built-in consumer that outputs structured JSON via `Utf8Stream`
- `child(bindings, options)`: child loggers with inherited context and serializers
- `logger.<level>.enabled`: getter for conditional logging (typo-safe)
- `stdSerializers`: built-in serializers for `err`, `req`, `res`
- `serialize` symbol, `Symbol.for('nodejs.logger.serialize')`: custom object serialization hook (stacked with field serializers)

Log levels follow RFC 5424 numerical ordering internally (`trace`: 10, `debug`: 20, `info`: 30, `warn`: 40, `error`: 50, `fatal`: 60). The API accepts string level names. The `diagnostics_channel` channel names (`log:trace`, `log:debug`, `log:info`, `log:warn`, `log:error`, `log:fatal`) are used internally and can be subscribed to directly via `diagnostics_channel` if needed.

Performance
Benchmarked against Pino (output to `/dev/null`, n=100000):

New files
- `lib/logger.js`: main module (`Logger`, `LogConsumer`, `JSONConsumer`)
- `lib/internal/logger/serializers.js`: standard `err`/`req`/`res` serializers
- `doc/api/logger.md`: full API documentation
- `benchmark/logger/basic-json.js`: benchmark
- `test/parallel/test-log-basic.js`: core logger tests
- `test/parallel/test-logger-serializers.js`: serializer tests

Modified files
- `src/node_options.h` / `src/node_options.cc`: `--experimental-logger` flag
- `lib/internal/process/pre_execution.js`: `setupLogger()` to allow module loading when the flag is set
- `lib/internal/bootstrap/realm.js`: module registration
- `doc/api/cli.md`: flag documentation
- `doc/api/index.md` / `doc/node.1`: index updates
- `test/parallel/test-code-cache.js` / `test-process-get-builtin.mjs` / `test-require-resolve.js`: test adjustments for the new module