Github user harshach commented on a diff in the pull request:

    https://github.com/apache/storm/pull/372#discussion_r22536766
  
    --- Diff: external/storm-jdbc/README.md ---
    @@ -0,0 +1,117 @@
    +#Storm JDBC
    +
    +Storm/Trident integration for JDBC.
    +
    +## Usage
    +The main API for interacting with JDBC is the `org.apache.storm.jdbc.mapper.JdbcMapper` interface:
    +
    +```java
    +public interface JdbcMapper extends Serializable {
    +    List<Column> getColumns(ITuple tuple);
    +}
    +```
    +
    +The `getColumns()` method defines how a Storm tuple maps to a list of columns representing a row in a database.
    +
    +### SimpleJdbcMapper
    +`storm-jdbc` includes a general purpose `JdbcMapper` implementation called `SimpleJdbcMapper` that can map a Storm
    +tuple to a database row. `SimpleJdbcMapper` assumes that the tuple has fields with the same names as the columns of
    +the database table that you intend to write to.
    +
    +To use `SimpleJdbcMapper`, you simply tell it the tableName that you want to write to and provide a HikariCP configuration map.
    +
    +The following code creates a `SimpleJdbcMapper` instance that:
    +
    +1. Will transform a Storm tuple to a list of columns mapping to a row in the table test.user_details.
    +2. Will use the provided HikariCP configuration to establish a connection pool with the specified database configuration and
    +automatically figure out the column names of the table that you intend to write to.
    +
    +```java
    +Map hikariConfigMap = Maps.newHashMap();
    +hikariConfigMap.put("dataSourceClassName", "com.mysql.jdbc.jdbc2.optional.MysqlDataSource");
    +hikariConfigMap.put("dataSource.url", "jdbc:mysql://localhost/test");
    +hikariConfigMap.put("dataSource.user","root");
    +hikariConfigMap.put("dataSource.password","password");
    +String tableName = "user_details";
    +JdbcMapper jdbcMapper = new SimpleJdbcMapper(tableName, hikariConfigMap);
    +```
    +### JdbcBolt
    +To use the `JdbcBolt`, construct it with the name of the table to write to and a `JdbcMapper` implementation. In
    +addition, you must specify a configuration key that holds the HikariCP configuration map.
    +
    +```java
    +Config config = new Config();
    +config.put("jdbc.conf", hikariConfigMap);
    +
    +JdbcBolt bolt = new JdbcBolt("user_details", jdbcMapper)
    +        .withConfigKey("jdbc.conf");
    --- End diff ---
    
    Is jdbc.conf a properties file? If so, withConfigKey could be renamed to withConfigFile. It would be great if you
    could add a sample jdbc.conf file to the README.
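    
    Something along these lines would do, purely as an illustration (the keys simply mirror the hikariConfigMap
    entries shown in the README above, assuming jdbc.conf really is a flat properties file):
    
    ```
    dataSourceClassName=com.mysql.jdbc.jdbc2.optional.MysqlDataSource
    dataSource.url=jdbc:mysql://localhost/test
    dataSource.user=root
    dataSource.password=password
    ```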

