Thanks a million! Working great!
Only thing is I had to create the connection string in app.config and then 
pass in the name of the connection. Would be great if I could pass in the 
raw connection details because I'm writing a dynamic import system which 
can take data from many different databases. Any ideas?

On that topic - what's the status of the source code and development of 
Rhino? I was using the 1.1.1.0 version downloaded via NuGet; however, when 
I download the latest version (2 years old) from github, it's not the same 
thing. Do I have the latest version?

Thanks again!

On Friday, November 2, 2012 3:47:42 PM UTC, Jason Meckley wrote:
>
> the convention operation will automatically convert the name of the 
> key to a parameter. here is an example.
>
> //sql
> insert into [table] ([column1], [column2]) values (@value1, @value2);
>
> //row
> new Row
> {
>     { "value1", 12345 },
>     { "value2", "hello world" },
> };
>
> If you don't use the convention method, then you must explicitly map the 
> row keys to the sql parameters.
>
> as for the connection string, i'm not sure if you can inline the 
> connection string or if it must be in the config file.
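>
> Going from memory, the database operations may also have an overload 
> accepting a ConnectionStringSettings instance (from System.Configuration), 
> which would let you build the connection inline instead of via app.config. 
> Something roughly like this - the exact overload is an assumption worth 
> checking against the source:
>
> // build the connection details in code (the name "inline" is arbitrary)
> var settings = new ConnectionStringSettings(
>     "inline",
>     "Data Source=(local);Initial Catalog=mydb;Integrated Security=SSPI;",
>     "System.Data.SqlClient"); // provider invariant name
>
> Register(new ConventionSqlBatchOperation(settings)
> {
>     BatchSize = 250,
>     Command = "insert into [table] ([column1], [column2]) " +
>               "values (@value1, @value2);"
> });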
>
> On Friday, November 2, 2012 10:43:36 AM UTC-4, Bill wrote:
>>
>> Thanks for your really quick reply! I've been looking at it for a while 
>> and still not sure how the values will map to the table columns! :(
>> How do the schema fields from my "Read" operation find their way into the 
>> @value1, etc? Also, I guess the "connection string" can be specified inline 
>> (rather than held in app.config or similar) e.g. "Data 
>> Source=(local);Initial Catalog=mydb;Integrated Security=SSPI;" - would that 
>> be correct?
>>
>> Thanks in advance for any help :)
>>
>>
>> On Friday, November 2, 2012 12:12:25 PM UTC, Jason Meckley wrote:
>>>
>>> 1. implement an operation to read the records from the file. Inherit 
>>> AbstractOperation and use the FluentEngine API to read the file.
>>> 2. pick one of the database output operations to insert the records into 
>>> the database. If it's a sql db I usually go for ConventionSqlBatchOperation. 
>>> That's my preference over SqlBulkInsert.
>>>      If you are using the convention methods you may want an 
>>> intermediate operation to change the field names from whatever was imported 
>>> from the file to the parameter names of the insert statement.
>>> 3. implement an ETL process that uses these operations.
>>> 4. run the process.
>>>
>>> here is an example
>>>
>>> class MyProcess : EtlProcess
>>> {
>>>     // in Rhino ETL you override Initialize() and Register() each operation
>>>     protected override void Initialize()
>>>     {
>>>         Register(new ReadFileOperation());
>>>         // "connectionStringName" refers to a connection string in app.config
>>>         Register(new ConventionSqlBatchOperation("connectionStringName")
>>>         {
>>>             BatchSize = 250,
>>>             Command = "insert into [table] ([column1], [column2]) " +
>>>                       "values (@value1, @value2);"
>>>         });
>>>     }
>>> }
>>>
>>> class ReadFileOperation : AbstractOperation
>>> {
>>>     // going from memory so this could be wrong, but it looks
>>>     // something like this...
>>>     public override IEnumerable<Row> Execute(IEnumerable<Row> rows)
>>>     {
>>>         // fileName: path to the input file
>>>         return FluentEngine<Dto>().Read(fileName);
>>>     }
>>> }
>>>
>>> //to run from C#
>>> new MyProcess().Execute();
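>>>
>>> And if the file's field names don't match the sql parameters, the 
>>> intermediate operation from step 2 could look roughly like this 
>>> (RenameFieldsOperation and the field names are made up for illustration):
>>>
>>> class RenameFieldsOperation : AbstractOperation
>>> {
>>>     public override IEnumerable<Row> Execute(IEnumerable<Row> rows)
>>>     {
>>>         foreach (var row in rows)
>>>         {
>>>             // copy the imported value under the name the
>>>             // insert statement expects ("fileColumn1" is hypothetical)
>>>             row["value1"] = row["fileColumn1"];
>>>             yield return row;
>>>         }
>>>     }
>>> }
>>>
>>> and register it between the read operation and the batch operation.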
>>>
>>> On Friday, November 2, 2012 6:16:59 AM UTC-4, Bill wrote:
>>>>
>>>> Hi,
>>>> Could someone point me towards a simple example of importing a file and 
>>>> outputting it to a database table? Just getting started - looks great but 
>>>> just trying to get my head around it. I need to better understand how DB 
>>>> connections are made, how the schema is used, etc.
>>>>
>>>> Thanks!
>>>>
>>>

-- 
You received this message because you are subscribed to the Google Groups 
"Rhino Tools Dev" group.
To view this discussion on the web visit 
https://groups.google.com/d/msg/rhino-tools-dev/-/TUexRq2cmw8J.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to 
[email protected].
For more options, visit this group at 
http://groups.google.com/group/rhino-tools-dev?hl=en.
