The Cosmo Router plugin framework provides built-in logging capabilities that integrate seamlessly with your plugin implementation. You can enable structured logging with different output formats and log levels to help with debugging and monitoring your plugins.
For advanced use cases where you need full control over the logger configuration:
func WithCustomLogger(logger hclog.Logger)
The WithCustomLogger function accepts any implementation of the hclog.Logger interface, giving you complete flexibility in your logging setup. This means you can:
Use hclog.New() with custom configuration (see the sketch after this list)
Implement your own logger that wraps other logging libraries (logrus, zap, etc.)
Create adapters to existing logging infrastructure
Build specialized loggers for specific requirements (e.g., filtering, routing, formatting)
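For example, a plugin entrypoint can construct a JSON-formatted hclog logger and hand it to WithCustomLogger. The following is a minimal sketch only: the routerplugin import path, the NewRouterPlugin constructor, and the service registration callback are assumptions modeled on a generated plugin scaffold and may differ in your project; WithCustomLogger and the hclog options are the documented pieces.

package main

import (
	"log"
	"os"

	"github.com/hashicorp/go-hclog"
	// Assumption: import path and entrypoint mirror the generated plugin
	// scaffold; adjust both to match your project.
	routerplugin "github.com/wundergraph/cosmo/router-plugin"
	"google.golang.org/grpc"
)

func main() {
	// Fully customized hclog logger: JSON output, Info level, written to
	// stderr so the host process can capture it.
	logger := hclog.New(&hclog.LoggerOptions{
		Name:       "projects-plugin",
		Level:      hclog.Info,
		Output:     os.Stderr,
		JSONFormat: true,
	})

	// Hypothetical scaffold entrypoint; WithCustomLogger is the documented
	// option, the surrounding setup is illustrative.
	pl, err := routerplugin.NewRouterPlugin(
		func(s *grpc.Server) {
			// Register your generated gRPC service implementation here.
		},
		routerplugin.WithCustomLogger(logger),
	)
	if err != nil {
		log.Fatalf("failed to create plugin: %v", err)
	}

	pl.Serve()
}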
Important: When using a custom logger with non-JSON output, you must disable timestamps in your logger configuration. The go-plugin framework parses log lines by checking whether they start with the log level (e.g., [TRACE], [INFO]). If timestamps are included, the line starts with the timestamp instead of the level, and the plugin framework falls back to debug level for every line. This restriction does not apply to JSON-formatted logs, which use structured parsing instead of line-prefix detection.
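As a sketch of that non-JSON case, hclog's DisableTime option suppresses the timestamp prefix so each line keeps the level tag at the start. The helper name below is illustrative; the hclog options are real.

import (
	"os"

	"github.com/hashicorp/go-hclog"
)

// newPlainTextLogger is an illustrative helper for a non-JSON configuration.
// DisableTime suppresses the timestamp prefix so each line starts with the
// level tag (e.g. [INFO]) that the go-plugin framework parses.
func newPlainTextLogger() hclog.Logger {
	return hclog.New(&hclog.LoggerOptions{
		Name:        "projects-plugin",
		Level:       hclog.Debug,
		Output:      os.Stderr,
		JSONFormat:  false,
		DisableTime: true, // required for non-JSON output
	})
}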
Since plugin logs are integrated into the router’s zap logger, they will appear in the router’s log output format. Here are examples of how your plugin logs will appear:
10:57:12 AM INFO darwin_arm64 grpcconnector/plugin_logger.go:48 QueryProjectStatuses called {"hostname": "cosmo", "pid": 71435, "service": "@wundergraph/router", "service_version": "dev", "timestamp": "2025-08-12T10:57:12.049+0200"}
10:57:12 AM INFO darwin_arm64 grpcconnector/plugin_logger.go:48 Processing request {"hostname": "cosmo", "pid": 71435, "service": "@wundergraph/router", "service_version": "dev", "timestamp": "2025-08-12T10:57:12.050+0200", "project_count": 5}
10:57:12 AM WARN darwin_arm64 grpcconnector/plugin_logger.go:48 Large number of projects requested {"hostname": "cosmo", "pid": 71435, "service": "@wundergraph/router", "service_version": "dev", "timestamp": "2025-08-12T10:57:12.051+0200", "count": 150}
10:57:12 AM INFO darwin_arm64 grpcconnector/plugin_logger.go:48 Creating new project {"hostname": "cosmo", "pid": 71435, "service": "@wundergraph/router", "service_version": "dev", "timestamp": "2025-08-12T10:57:12.052+0200", "name": "My Project", "description": "A new project", "user_id": "user123"}
10:57:12 AM INFO darwin_arm64 grpcconnector/plugin_logger.go:48 Project created successfully {"hostname": "cosmo", "pid": 71435, "service": "@wundergraph/router", "service_version": "dev", "timestamp": "2025-08-12T10:57:12.055+0200", "project_id": "proj_456", "name": "My Project", "duration_ms": 3}
10:57:12 AM ERROR darwin_arm64 grpcconnector/plugin_logger.go:48 Failed to create project {"hostname": "cosmo", "pid": 71435, "service": "@wundergraph/router", "service_version": "dev", "timestamp": "2025-08-12T10:57:12.056+0200", "error": "database connection failed", "name": "Invalid Project", "duration_ms": 1}
As you can see, your plugin logs are seamlessly integrated with the router’s logging system, including all the router metadata like hostname, PID, service information, and timestamps.
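For reference, output like the above comes from ordinary hclog calls with key-value pairs. A minimal sketch, assuming your service implementation keeps a reference to the logger passed to WithCustomLogger; the struct and method below are illustrative, not part of the generated code.

package service

import (
	"time"

	"github.com/hashicorp/go-hclog"
)

// ProjectsService is an illustrative service implementation that holds the
// logger handed to WithCustomLogger.
type ProjectsService struct {
	logger hclog.Logger
}

// CreateProject shows how key-value pairs on hclog calls become the
// structured fields seen in the router log output above.
func (s *ProjectsService) CreateProject(name, description, userID string) {
	start := time.Now()

	s.logger.Info("Creating new project",
		"name", name,
		"description", description,
		"user_id", userID,
	)

	// ... create the project ...

	s.logger.Info("Project created successfully",
		"name", name,
		"duration_ms", time.Since(start).Milliseconds(),
	)
}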
When using JSON logging in production, the structured output integrates well with log aggregation systems like:
ELK Stack (Elasticsearch, Logstash, Kibana)
Fluentd/Fluent Bit
Grafana Loki
Cloud logging services (AWS CloudWatch, Google Cloud Logging, etc.)
The structured JSON format makes it easy to query, filter, and create dashboards based on your plugin logs.
See also: Plugins · gRPC Services · GraphQL Support