Interesting idea.  Does LastInsertID return the rowid of the row that was
the dupe?  I suppose I can test that...
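
Quick sketch of what I'm planning to test, just so it's on the list (untested;
CmdLine/CmdEntered are the names from my original mail below, the id column and
index name are just guesses, and the expression index needs a reasonably recent
SQLite):

  CREATE TABLE CmdLine(id INTEGER PRIMARY KEY, CmdEntered TEXT);
  -- unique on the uppercased text, per Paul's suggestion
  CREATE UNIQUE INDEX idxCmdUpper ON CmdLine(upper(CmdEntered));

  INSERT OR IGNORE INTO CmdLine(CmdEntered) VALUES('ls -al');
  SELECT changes();            -- 0 here means the row was a dupe and got ignored
  SELECT last_insert_rowid();  -- only reflects the last *successful* insert,
                               -- so on a dupe I'd still have to look it up:
  SELECT id FROM CmdLine WHERE upper(CmdEntered) = upper('ls -al');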

On Thu, Feb 2, 2017 at 11:34 AM, Paul Sanderson <sandersonforens...@gmail.com> wrote:

> You could make the CmdEntered field unique, or create a hash on the
> uppercase content of the command and make that a unique key.
>
> Then use INSERT OR IGNORE...
> Paul
> www.sandersonforensics.com
> skype: r3scue193
> twitter: @sandersonforens
> Tel +44 (0)1326 572786
> http://sandersonforensics.com/forum/content.php?195-SQLite-Forensic-Toolkit
> -Forensic Toolkit for SQLite
> email from a work address for a fully functional demo licence
>
>
> On 2 February 2017 at 16:22, Stephen Chrzanowski <pontia...@gmail.com> wrote:
> > Per a new requirement from my manager, we're asked to log all our SSH
> > sessions to our customer machines.  The current Windows search is a PITA,
> > grepping for text is burdensome considering the number of sessions I open
> > per day, and being a pack rat, I love reading about stuff I did years
> > ago. :]  (Not to mention the CYA thing is the primary reason for this
> > move -- I'm not complaining.)
> >
> > So I'm thinking about writing a tool that'll take the output of the PuTTY
> > logs, read them line by line, and insert the data into a SQLite database
> > with some referential integrity that will allow me to search against what
> > server I'm connecting to, a range of dates for the logs, particular text,
> > etc.  (Granted, there is a huge range of errors that could happen with
> > this that I'm not anticipating, but, meh.  I need something.)
> >
> > Getting the data from the logs into a database is a true no-brainer.
> > Same with parsing and deciding how I want to check for text and properly
> > catalog what I'm doing per machine.  In some PuTTY sessions I jump
> > between several machines, so during the line reading I'll be looking for
> > keywords (hostnames) based on the command prompt, since our prompts are
> > the same across all machines.
> >
> > During the reading process, what I want to do is read the line in, check
> > the database to see whether what I've read in already exists, and react
> > accordingly by adding the new entry and setting up the relationships in
> > other tables.  Child's play, IMO.
> >
> > But in my preplanning, scenario development, and brainstorming, the
> > above paragraph is going to destroy my machine, doing a [ select * from
> > CmdLine where upper(CmdEntered) = upper('SomeText') ] every time I read
> > a new line from a new log file to verify whether the entry has already
> > been made.  So my thought leans towards FTS, but I've never written
> > anything dealing with that.
> >
> > Is there any kind of special preparation I need to do to the database to
> > get it working effectively?  Is there a particular way I have to make my
> > queries to see if previous text exists?  Is there a primer, not on the
> > theory of how it works in the back end, but on how to generate the SQL
> > call and deal with what comes out?  Are there any caveats I need to be
> > aware of?  Do I skip FTS and just roll my own word analyzer?
> >
> > Since Oct 2016, my logs are sitting at just shy of 700 meg of text.
> > Looking for what I did on a particular machine even last month would be
> > a pain at this point.
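
For the FTS part of my original question above, this is roughly the shape I
have in mind, in case anyone wants to correct me before I start (rough sketch
only, assuming the FTS5 module is compiled in; CmdSearch is a made-up name):

  -- full-text copy of the command text; FTS matches whole words/tokens,
  -- so it's for the "what did I run on machine X" searches, not for the
  -- duplicate check above
  CREATE VIRTUAL TABLE CmdSearch USING fts5(CmdEntered);

  -- populate it from the main table (or as each line is inserted)
  INSERT INTO CmdSearch(rowid, CmdEntered)
       SELECT id, CmdEntered FROM CmdLine;

  -- then searching is a MATCH instead of an upper()/LIKE table scan
  SELECT rowid FROM CmdSearch WHERE CmdSearch MATCH 'rsync';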