Oracle JDBC Input - Practical Testing Approach
Quick Start Configuration
For initial testing with Oracle, here's the minimal working config:
```ruby
input {
  jdbc {
    jdbc_connection_string => "jdbc:oracle:thin:@//hostname:1521/servicename"
    jdbc_user => "user"
    jdbc_password => "password"
    jdbc_driver_library => "/path/to/ojdbc8.jar"
    jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
    jdbc_validate_connection => true
    statement => "SELECT * FROM your_table WHERE ROWNUM <= 10"
  }
}

output {
  stdout { codec => rubydebug }
}
```
Critical Oracle-specific notes:
- Must use `Java::oracle.jdbc.driver.OracleDriver` (note the `Java::` prefix)
- Connection string format for a service name: `jdbc:oracle:thin:@//host:port/service`; for a SID: `jdbc:oracle:thin:@host:port:sid`
- Download ojdbc8.jar or ojdbc11.jar from Oracle's website
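If your database is addressed by a SID rather than a service name, only the connection line in the quick-start config changes; a sketch with placeholder host and SID:

```ruby
# SID-style: no "//" before the host, and ":" (not "/") before the SID
jdbc_connection_string => "jdbc:oracle:thin:@hostname:1521:ORCL"
```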
How to Verify It's Working
1. Always start with stdout output
This is the fastest way to confirm data retrieval without Elasticsearch complications. Run:
```bash
/opt/logstash/bin/logstash -f your-config.conf
```
If you see formatted events containing your table data in the console, the connection works.
2. Check Logstash logs
Look in /var/log/logstash/logstash-plain.log for:
- INFO messages confirming the JDBC input plugin started
- Confirmation that the connection test passed (database reachable)
- Error messages if something fails
3. For detailed debugging
Run with debug logging:
```bash
/opt/logstash/bin/logstash -f config.conf --log.level=debug
```
This shows SQL execution, row counts, and connection details.
4. Once stdout works, verify Elasticsearch
```bash
# Check that the index was created
curl "localhost:9200/_cat/indices?v" | grep your-index

# View sample docs (quote the URL so the shell doesn't treat & as a background operator)
curl "localhost:9200/your-index/_search?pretty&size=2"
```
Common Pitfalls
Driver not loading: Make sure the jar path is absolute and readable by the Logstash user. If it still fails, try placing the jar in /opt/logstash/vendor/jar/jdbc/.
Connection string errors: Oracle is picky about the format. If one form doesn't work, try the alternate (service name vs. SID).
No data in ES but stdout shows data: Usually a mapping issue or output misconfiguration. Check the Logstash logs for Elasticsearch errors.
Large table performance: Add `jdbc_paging_enabled => true` and `jdbc_page_size => 50000` to avoid memory issues.
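Tying the last two pitfalls together, here is a hedged sketch of an input with paging enabled and an Elasticsearch output with an explicit index name; the host and index name are placeholder assumptions:

```ruby
input {
  jdbc {
    # ... connection settings as in the quick-start config ...
    jdbc_paging_enabled => true
    jdbc_page_size => 50000
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "your-index"
  }
}
```

With paging enabled, Logstash wraps the statement in paginated subqueries, so a full-table statement no longer has to fit in memory at once.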
Recommended Testing Workflow
- Start with a `ROWNUM <= 10` limit and stdout output
- Verify you see data in the console
- Switch the output to Elasticsearch
- Check that a few documents made it to ES
- Then add scheduling (`schedule => "*/5 * * * *"`) and incremental queries
- Finally, remove the ROWNUM limit for production
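The scheduling-plus-incremental step might look like the following sketch; the tracking column name `updated_at` is an assumption for illustration, and `:sql_last_value` is the plugin's built-in placeholder for the last tracked value:

```ruby
input {
  jdbc {
    # ... connection settings as in the quick-start config ...
    schedule => "*/5 * * * *"   # run every 5 minutes
    statement => "SELECT * FROM your_table WHERE updated_at > :sql_last_value"
    use_column_value => true
    tracking_column => "updated_at"       # assumed timestamp column
    tracking_column_type => "timestamp"
  }
}
```

Between runs, the plugin persists the last seen `updated_at` value (in `last_run_metadata_path`), so each scheduled run fetches only rows newer than the previous run.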
Useful Resources
If you're using Energy Logserver 7.x, the Integrations plugin has an Oracle Database Server integration that provides ready-made Logstash configs and dashboards - might be worth checking after you verify basic connectivity.