Re: [sqlite] New tool for PUTTY logging [Windows]

2017-02-02 Thread Simon Slavin

On 2 Feb 2017, at 8:05pm, Stephen Chrzanowski  wrote:

> There's a little bit more involved than just consolidating the files into
> one that I need.  Specifically, since the command line on all customer
> Linux machines is formatted a certain way, I can easily identify what
> machine I'm specifically looking at, and filter results based on that.

Sorry, I forgot that.  So add another column:

name of text file
entry number (line in file ?)
computer identifier (IP address ?)
data

Step 1 can be to get it all into a SQLite database of any kind.  Worry about 
the final form and whether you’ll want to use FTS later.
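That column list might look something like the following in SQLite (a sketch only; the table and column names are my own guesses, shown via Python's built-in sqlite3 module):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""
    CREATE TABLE log_line (
        filename TEXT,     -- name of the text file the line came from
        entry_no INTEGER,  -- entry number (line in file)
        machine  TEXT,     -- computer identifier (IP address or hostname)
        data     TEXT      -- the raw log line
    )
""")
db.execute("INSERT INTO log_line VALUES (?, ?, ?, ?)",
           ("putty-2017-02-02.log", 1, "db01.example.com", "$ uptime"))
row = db.execute("SELECT machine, data FROM log_line").fetchone()
```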

Simon.
___
sqlite-users mailing list
sqlite-users@mailinglists.sqlite.org
http://mailinglists.sqlite.org/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] New tool for PUTTY logging [Windows]

2017-02-02 Thread Stephen Chrzanowski
Definitely radical and possible, but something I don't think I'd like to
take on, simply because I'm a Delphi dev'r, not C/C++.  (I did do ten
other people's final C++ projects back in my college days, but that was
two decades ago.  I don't mind saying that, but man do I hate realizing
that much time has passed!)  However, if that type of additional logging
does become a thing for Putty, I might want to put a reservation in the
code so that the log file has that information somewhere.

But you're correct.  If the log outputs were able to put a time stamp per
line, that'd help with at least narrowing down the time resolution.  But
that isn't the primary purpose of this particular project.  It's just easy
search, fast updates to the database, and focused results.

On Thu, Feb 2, 2017 at 3:01 PM, Rob Willett 
wrote:

> I've been following this thread with interest. I have used Putty for years
> as it's the de facto standard for decent ssh terminals on Windows boxes.
>
> A slightly more radical suggestion for the log files. Since Putty is open
> source, have a look at the code and see if you can easily add in a
> timestamp per line for the log file section.
>
> That gives you two features:
>
> 1. You now have a verifiable traceable source as you have downloaded and
> compiled it. I have worked in environments where we need to trace every bit
> of code that comes into the data centre. We need to know where we
> downloaded it from, what the license was, authority from legal to use etc
> etc. Your situation might not warrant it :)
>
> 2. You now have unique lines with a timestamp, a hostname and a TTY
> session (I assume). I think that guarantees uniqueness.
>
> I have no idea if Putty can be modified in this way, but it wouldn't hurt
> to have a look, see if the change is easy, do the change and then send some
> diffs back to the Putty team. If they accept the changes you're sorted. If
> they don't well, Putty doesn't change that much over time so you could
> probably use your version for years to come.
>
> Rob
>
>


Re: [sqlite] New tool for PUTTY logging [Windows]

2017-02-02 Thread Stephen Chrzanowski
There's a little bit more involved than just consolidating the files into
one that I need.  Specifically, since the command line on all customer
Linux machines is formatted a certain way, I can easily identify what
machine I'm specifically looking at, and filter results based on that.
Because I'm looking at around 1000 files and 600meg of raw text, it isn't
easily consumable by a human, so the only relevant identifiers I need
right now are the start date (based on the file name), the end date (based
on the file time stamp), and the server itself (which can be a mix of
servers in a single log file), which, as I said, is part of the command
line.  Removing the duplication of each line in that text file is also
important to me, as there is no reason to have 500 entries of my
initiating the SSH session to a customer site.

With exactly zero code written and no database file created, I'm planning
on having a log-entry table with an auto-numbered PK field, a text field,
and a hash of the text field (collisions may occur when [ new.text
<> old.text and new.hash = old.hash ], but I'm not interested in 100%
absolutely perfect results).  I'll then have a table that contains just a
list of server names picked up from reading the log files, with a PK field
as well, and then one table that contains an auto-numbered PK, a reference
to the server PK, and a reference to the PK in the log-entry table, and
wiggle in the date mentioned in the filename as well.
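As a sketch of that three-table plan (table and column names are my guesses, and SHA-1 stands in for whichever hash ends up being chosen):

```python
import hashlib
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE log_entry (
        id   INTEGER PRIMARY KEY,   -- auto-numbered
        line TEXT NOT NULL,
        hash TEXT NOT NULL UNIQUE   -- hash of the line; rare collisions accepted
    );
    CREATE TABLE server (
        id   INTEGER PRIMARY KEY,
        name TEXT NOT NULL UNIQUE   -- picked up from the command prompts
    );
    CREATE TABLE sighting (
        id        INTEGER PRIMARY KEY,
        server_id INTEGER NOT NULL REFERENCES server(id),
        entry_id  INTEGER NOT NULL REFERENCES log_entry(id),
        log_date  TEXT NOT NULL     -- date taken from the log file's name
    );
""")

def record(line, server_name, log_date):
    # Deduplicate the text via its hash; always record the sighting.
    h = hashlib.sha1(line.encode()).hexdigest()
    db.execute("INSERT OR IGNORE INTO log_entry (line, hash) VALUES (?, ?)",
               (line, h))
    entry_id = db.execute("SELECT id FROM log_entry WHERE hash = ?",
                          (h,)).fetchone()[0]
    db.execute("INSERT OR IGNORE INTO server (name) VALUES (?)", (server_name,))
    server_id = db.execute("SELECT id FROM server WHERE name = ?",
                           (server_name,)).fetchone()[0]
    db.execute("INSERT INTO sighting (server_id, entry_id, log_date) "
               "VALUES (?, ?, ?)", (server_id, entry_id, log_date))

record("ssh db01", "db01", "2017-02-02")
record("ssh db01", "db01", "2017-02-03")  # same text: one entry, two sightings
```

Repeated text lands in log_entry only once, while every occurrence still shows up as a dated sighting.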


On Thu, Feb 2, 2017 at 11:59 AM, Simon Slavin  wrote:

>
>
> Under those circumstances, all you’re really doing by putting this data in
> a SQLite database is consolidating lots of separate files into one.  So
> import everything from those files, without checking for duplicates, using
> as your primary key the combination of logfile name and line number.  To
> avoid having to deal with errors from duplicates use
>
> INSERT OR IGNORE ...
>
> Once you’ve worked out how to get it into a SQLite database you can decide
> whether to do searches using LIKE or FTS.  Or duplicate your database and
> experiment with both approaches to find your ideal balance of filesize and
> search speed.
>
> Simon.


Re: [sqlite] New tool for PUTTY logging [Windows]

2017-02-02 Thread Rob Willett
I've been following this thread with interest. I have used Putty for
years as it's the de facto standard for decent ssh terminals on Windows
boxes.


A slightly more radical suggestion for the log files. Since Putty is 
open source, have a look at the code and see if you can easily add in a 
timestamp per line for the log file section.


That gives you two features:

1. You now have a verifiable traceable source as you have downloaded and 
compiled it. I have worked in environments where we need to trace every 
bit of code that comes into the data centre. We need to know where we 
downloaded it from, what the license was, authority from legal to use 
etc etc. Your situation might not warrant it :)


2. You now have unique lines with a timestamp, a hostname and a TTY 
session (I assume). I think that guarantees uniqueness.


I have no idea if Putty can be modified in this way, but it wouldn't 
hurt to have a look, see if the change is easy, do the change and then 
send some diffs back to the Putty team. If they accept the changes 
you're sorted. If they don't, well, Putty doesn't change that much over 
time so you could probably use your version for years to come.


Rob

On 2 Feb 2017, at 19:53, Stephen Chrzanowski wrote:

I can only get to our customer machines by jumping into a server that has
access to both sides of the network.  Our side, and the customer side.  I
can't get to a customers machine directly.  The  is out, but I'm already
doing the rest.

The image in my head of what my program is going to do is that I feed it a
date range, a server I'm interested in, and optionally provide text that
further filters the information I'm looking for.  Once I have the filtered
data, I'd have a list of days that I'd been on that exact server, and/or
entries that mention my subject server, and I can see the text only
pertaining to that machine and date range.  I'd be able to read the full
set of activities on that machine for that day, and not have to hop around
to multiple log files..  This would get rid of the concept of many log
files as well, since all files are now one.  Kind of Borg-ish?


On Thu, Feb 2, 2017 at 11:54 AM, Donald Griggs  wrote:

Maybe another method to consider:

This guy shows that Putty appears to support creating separate log files
for each session including a timestamp in the file name.

https://www.viktorious.nl/2013/01/14/putty-log-all-session-output/

Could your script import any new log files it sees, then move them to an
archive?

That way, you'd never have to read through a huge log file to find what
should be imported.


==From the page linked above:

I am using some putty parameters which will make every session unique, in
this case “”, which means:

   -  = hostname for the session
   -  = year
   -  = month
   -  = day
   -  = time

 ==





Re: [sqlite] New tool for PUTTY logging [Windows]

2017-02-02 Thread Stephen Chrzanowski
I can only get to our customer machines by jumping into a server that has
access to both sides of the network.  Our side, and the customer side.  I
can't get to a customer's machine directly.  The  is out, but I'm already
doing the rest.

The image in my head of what my program is going to do is that I feed it a
date range, a server I'm interested in, and optionally provide text that
further filters the information I'm looking for.  Once I have the filtered
data, I'd have a list of days that I'd been on that exact server, and/or
entries that mention my subject server, and I can see the text only
pertaining to that machine and date range.  I'd be able to read the full
set of activities on that machine for that day, and not have to hop around
to multiple log files.  This would get rid of the concept of many log
files as well, since all files are now one.  Kind of Borg-ish?


On Thu, Feb 2, 2017 at 11:54 AM, Donald Griggs  wrote:

> Maybe another method to consider:
>
> This guy shows that Putty appears to support creating separate log files
> for each session including a timestamp in the file name.
>
> https://www.viktorious.nl/2013/01/14/putty-log-all-session-output/
>
> Could your script import any new log files it sees, then move them to an
> archive?
>
> That way, you'd never have to read through a huge log file to find what
> should be imported.
>
>
> ==From the page linked above:
>
>
> I am using some putty parameters which will make every session unique, in
> this case “”, which means:
>
>-  = hostname for the session
>-  = year
>-  = month
>-  = day
>-  = time
>
>  ==


Re: [sqlite] New tool for PUTTY logging [Windows]

2017-02-02 Thread Simon Slavin

On 2 Feb 2017, at 4:48pm, Stephen Chrzanowski  wrote:

> Unfortunately no, there is no time stamp at the command lines, and I can't
> add that ability  (Maybe if I setup my own new account on our jump-point
> server, but then I've got another kettle to deal with).  The only reference
> to a time is based on the filename that Putty creates the file with, and
> the last modified time stamp on the file at the file system level.  Then I
> have that as a range of days, which should be good enough to start
> pinpointing what I may need to find.
> what I may need to find.

Under those circumstances, all you’re really doing by putting this data in a 
SQLite database is consolidating lots of separate files into one.  So import 
everything from those files, without checking for duplicates, using as your 
primary key the combination of logfile name and line number.  To avoid having 
to deal with errors from duplicates use

INSERT OR IGNORE ...
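A sketch of that import, using the (logfile name, line number) pair as the primary key so INSERT OR IGNORE silently skips any line that has already been imported (table and column names are mine; WITHOUT ROWID is optional but suits a composite key):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""
    CREATE TABLE log_line (
        filename TEXT,
        lineno   INTEGER,
        data     TEXT,
        PRIMARY KEY (filename, lineno)  -- logfile name + line number
    ) WITHOUT ROWID
""")

# The third row is a re-import of line 1; OR IGNORE drops it without error.
rows = [("a.log", 1, "$ ls"),
        ("a.log", 2, "$ pwd"),
        ("a.log", 1, "$ ls")]
db.executemany("INSERT OR IGNORE INTO log_line VALUES (?, ?, ?)", rows)
count = db.execute("SELECT COUNT(*) FROM log_line").fetchone()[0]
```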

Once you’ve worked out how to get it into a SQLite database you can decide 
whether to do searches using LIKE or FTS.  Or duplicate your database and 
experiment with both approaches to find your ideal balance of filesize and 
search speed.

Simon.


Re: [sqlite] New tool for PUTTY logging [Windows]

2017-02-02 Thread Stephen Chrzanowski
Unfortunately no, there is no time stamp at the command lines, and I can't
add that ability (maybe if I set up my own new account on our jump-point
server, but then I've got another kettle of fish to deal with).  The only
reference to a time is based on the filename that Putty creates the file
with, and the last modified time stamp on the file at the file system
level.  Then I have that as a range of days, which should be good enough
to start pinpointing what I may need to find.

For past entries, that would be the best time resolution I would have.
However, going forward, if this software were to run as a type of service
(or a hidden app that runs in the background -- it's my machine, it's
allowed), I could just watch when a file is changed and put in whatever is
new, and the resolution would become the time between checks.

The other duplicate lines that can show up aren't from what I type, but
from what is returned.  There are a lot of times I use WATCH to keep tabs
on the age of files in a particular directory, or spam LS to do something
similar, etc.  There is a lot of duplication (I can't even imagine to what
level, as I just don't know) that I'm just not aware of or haven't
noticed, but it's something I know exists and something I'd like to trim
down.

(Not to mention, I really wanna get rid of 900 files off my drive and
consolidate into one)

On Thu, Feb 2, 2017 at 11:36 AM, Simon Slavin  wrote:

>
> On 2 Feb 2017, at 4:22pm, Stephen Chrzanowski  wrote:
>
> > But, in my preplanning, scenario development and brain storming, the
> above
> > paragraph is going to destroy my machine doing a [ select * from CmdLine
> > where upper(CmdEntered) =upper('SomeText') ] every time I read a new line
> > from a new log file to verify if the entry has been made.
>
> Wait wait wait.  I type duplicate commands all the time.
>
> When you create your text log does it not have a timestamp on every line
> ?  If not, can you make it do so ?  Then all you have to do is check
> whether the timestamp matches a column of the table.  Should be a
> fixed-length or delimited field, easy to isolate.
>
> Simon.


Re: [sqlite] New tool for PUTTY logging [Windows]

2017-02-02 Thread Donald Griggs
Maybe another method to consider:

This guy shows that Putty appears to support creating separate log files
for each session including a timestamp in the file name.

https://www.viktorious.nl/2013/01/14/putty-log-all-session-output/

Could your script import any new log files it sees, then move them to an
archive?

That way, you'd never have to read through a huge log file to find what
should be imported.


==From the page linked above:


I am using some putty parameters which will make every session unique, in
this case “”, which means:

   -  = hostname for the session
   -  = year
   -  = month
   -  = day
   -  = time

 ==
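A rough sketch of the import-then-archive pass described above, assuming one-session-per-file logs land in a folder (all paths, table, and function names here are hypothetical):

```python
import pathlib
import shutil
import sqlite3
import tempfile

def import_and_archive(log_dir, archive_dir, db):
    """Import every *.log file, then move it aside so it is never re-read."""
    archive_dir.mkdir(exist_ok=True)
    for log in sorted(log_dir.glob("*.log")):
        with log.open() as f:
            for lineno, line in enumerate(f, start=1):
                db.execute("INSERT OR IGNORE INTO log_line VALUES (?, ?, ?)",
                           (log.name, lineno, line.rstrip("\n")))
        shutil.move(str(log), str(archive_dir / log.name))
    db.commit()

# Demo with a throwaway directory and an in-memory database.
root = pathlib.Path(tempfile.mkdtemp())
(root / "logs").mkdir()
(root / "logs" / "db01-2017-02-02.log").write_text("$ uptime\n$ ls\n")

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE log_line (filename TEXT, lineno INTEGER, data TEXT, "
           "PRIMARY KEY (filename, lineno))")
import_and_archive(root / "logs", root / "archive", db)
imported = db.execute("SELECT COUNT(*) FROM log_line").fetchone()[0]
remaining = list((root / "logs").glob("*.log"))  # should now be empty
```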


Re: [sqlite] New tool for PUTTY logging [Windows]

2017-02-02 Thread Simon Slavin

On 2 Feb 2017, at 4:22pm, Stephen Chrzanowski  wrote:

> But, in my preplanning, scenario development and brain storming, the above
> paragraph is going to destroy my machine doing a [ select * from CmdLine
> where upper(CmdEntered) =upper('SomeText') ] every time I read a new line
> from a new log file to verify if the entry has been made.

Wait wait wait.  I type duplicate commands all the time.

When you create your text log does it not have a timestamp on every line ?  If 
not, can you make it do so ?  Then all you have to do is check whether the 
timestamp matches a column of the table.  Should be a fixed-length or delimited 
field, easy to isolate.

Simon.


Re: [sqlite] New tool for PUTTY logging [Windows]

2017-02-02 Thread Stephen Chrzanowski
Interesting idea.  Does LastInsertID return the row that was a dupe?  I
suppose I can test that...

On Thu, Feb 2, 2017 at 11:34 AM, Paul Sanderson <
sandersonforens...@gmail.com> wrote:

> You could make the CmdEntered field unique, or create a hash on the
> uppercase content of the command and make that a unique key.
>
> Then use INSERT OR IGNORE...
> Paul
> www.sandersonforensics.com
> skype: r3scue193
> twitter: @sandersonforens
> Tel +44 (0)1326 572786
> http://sandersonforensics.com/forum/content.php?195-SQLite-
> Forensic-Toolkit
> -Forensic Toolkit for SQLite
> email from a work address for a fully functional demo licence
>
>
> On 2 February 2017 at 16:22, Stephen Chrzanowski 
> wrote:
> > By a new requirement of my manager, we're asked to log all our SSH
> sessions
> > to our customer machines.  The current Windows search is a PITA, grepping
> > for text is burdensome considering the number of sessions I open per day,
> > and being a pack rat, I love reading about stuff I did years ago. :]
> (Not
> > to mention the CYA thing is primary reason for this move -- I'm not
> > complaining)
> >
> > So I'm thinking about writing a tool that'll take the output of the PUTTY
> > logs, read them line by line, and insert the data into a SQLite database
> > with some referential integrity that will allow me to search against what
> > server I'm connecting to, a range of dates the logs, particular text,
> etc.
> > (Granted there is a huge range of error that could happen with this that
> > I'm not anticipating, but, meh.  I need something)
> >
> > Getting the data from the logs into a database is a true no-brainer.
> Same
> > with parsing and deciding how I want to check for text and properly
> catalog
> > what I'm doing per machine.  Some putty sessions I jump between several
> > machines, so during the line reading, I'll be looking for keywords
> > (Hostnames) based on the command prompt since how our prompts are
> globally
> > the same across all machines.
> >
> > During the reading process, what I want to do is read the line in, check
> > the database to see what I've read in already exists and react
> accordingly
> > by adding the new entry and setting up the relationships in other tables.
> > Child's play, IMO.
> >
> > But, in my preplanning, scenario development and brain storming, the
> above
> > paragraph is going to destroy my machine doing a [ select * from CmdLine
> > where upper(CmdEntered) =upper('SomeText') ] every time I read a new line
> > from a new log file to verify if the entry has been made.  So my thought
> > leans towards FTS, but, I've never written anything dealing with that.
> >
> > Is there any kind of special preparation I need to do to the database to
> > get it working effectively?  Is there a particular way I have to make my
> > queries to see if previous text exists?  Is there a primer, not on the
> > theory of how it works in the back end, but, how to generate the SQL call
> > and deal with what comes out?  Are there any caveats I need to be aware
> > of?  Do I skip FTS and just roll my own word analyzer?
> >
> > Since Oct 2016, my logs are sitting at just shy of 700meg of text.
> Looking
> > for what I did on a particular machine even last month would be a pain at
> > this point.


Re: [sqlite] New tool for PUTTY logging [Windows]

2017-02-02 Thread Paul Sanderson
You could make the CmdEntered field unique, or create a hash on the
uppercase content of the command and make that a unique key.

Then use INSERT OR IGNORE...
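A sketch of that suggestion: hash the uppercased command, put a UNIQUE constraint on the hash column, and let INSERT OR IGNORE drop the duplicates (SHA-1 is just an example choice of hash; CmdLine/CmdEntered are the names from the original post):

```python
import hashlib
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""
    CREATE TABLE CmdLine (
        id         INTEGER PRIMARY KEY,
        CmdEntered TEXT NOT NULL,
        CmdHash    TEXT NOT NULL UNIQUE  -- hash of the uppercased command
    )
""")

def add_command(cmd):
    # Uppercase before hashing so case-variant duplicates collide on purpose.
    h = hashlib.sha1(cmd.upper().encode()).hexdigest()
    db.execute("INSERT OR IGNORE INTO CmdLine (CmdEntered, CmdHash) "
               "VALUES (?, ?)", (cmd, h))

for cmd in ["ls -l", "LS -L", "pwd"]:   # first two differ only by case
    add_command(cmd)
stored = db.execute("SELECT COUNT(*) FROM CmdLine").fetchone()[0]
```

No SELECT-per-line is needed: the UNIQUE index does the duplicate check inside the insert itself.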
Paul
www.sandersonforensics.com
skype: r3scue193
twitter: @sandersonforens
Tel +44 (0)1326 572786
http://sandersonforensics.com/forum/content.php?195-SQLite-Forensic-Toolkit
-Forensic Toolkit for SQLite
email from a work address for a fully functional demo licence


On 2 February 2017 at 16:22, Stephen Chrzanowski  wrote:
> By a new requirement of my manager, we're asked to log all our SSH sessions
> to our customer machines.  The current Windows search is a PITA, grepping
> for text is burdensome considering the number of sessions I open per day,
> and being a pack rat, I love reading about stuff I did years ago. :]  (Not
> to mention the CYA thing is the primary reason for this move -- I'm not
> complaining)
>
> So I'm thinking about writing a tool that'll take the output of the PUTTY
> logs, read them line by line, and insert the data into a SQLite database
> with some referential integrity that will allow me to search against what
> server I'm connecting to, a date range for the logs, particular text, etc.
> (Granted there is a huge range of error that could happen with this that
> I'm not anticipating, but, meh.  I need something)
>
> Getting the data from the logs into a database is a true no-brainer.  Same
> with parsing and deciding how I want to check for text and properly catalog
> what I'm doing per machine.  Some putty sessions I jump between several
> machines, so during the line reading, I'll be looking for keywords
> (Hostnames) based on the command prompt, since our prompts are globally
> the same across all machines.
>
> During the reading process, what I want to do is read the line in, check
> the database to see whether what I've read in already exists, and react
> accordingly by adding the new entry and setting up the relationships in
> other tables.  Child's play, IMO.
>
> But, in my preplanning, scenario development and brain storming, the above
> paragraph is going to destroy my machine doing a [ select * from CmdLine
> where upper(CmdEntered) =upper('SomeText') ] every time I read a new line
> from a new log file to verify if the entry has been made.  So my thought
> leans towards FTS, but, I've never written anything dealing with that.
>
> Is there any kind of special preparation I need to do to the database to
> get it working effectively?  Is there a particular way I have to make my
> queries to see if previous text exists?  Is there a primer, not on the
> theory of how it works in the back end, but, how to generate the SQL call
> and deal with what comes out?  Are there any caveats I need to be aware
> of?  Do I skip FTS and just roll my own word analyzer?
>
> Since Oct 2016, my logs are sitting at just shy of 700meg of text.  Looking
> for what I did on a particular machine even last month would be a pain at
> this point.


[sqlite] New tool for PUTTY logging [Windows]

2017-02-02 Thread Stephen Chrzanowski
Due to a new requirement from my manager, we're asked to log all our SSH
sessions to our customer machines.  The current Windows search is a PITA,
grepping for text is burdensome considering the number of sessions I open
per day, and being a pack rat, I love reading about stuff I did years
ago. :]  (Not to mention the CYA thing is the primary reason for this
move -- I'm not complaining)

So I'm thinking about writing a tool that'll take the output of the PUTTY
logs, read them line by line, and insert the data into a SQLite database
with some referential integrity that will allow me to search against what
server I'm connecting to, a date range for the logs, particular text, etc.
(Granted there is a huge range of error that could happen with this that
I'm not anticipating, but, meh.  I need something)

Getting the data from the logs into a database is a true no-brainer.  Same
with parsing and deciding how I want to check for text and properly catalog
what I'm doing per machine.  Some putty sessions I jump between several
machines, so during the line reading, I'll be looking for keywords
(Hostnames) based on the command prompt, since our prompts are globally
the same across all machines.

During the reading process, what I want to do is read the line in, check
the database to see whether what I've read in already exists, and react
accordingly by adding the new entry and setting up the relationships in
other tables.  Child's play, IMO.

But, in my preplanning, scenario development and brain storming, the above
paragraph is going to destroy my machine doing a [ select * from CmdLine
where upper(CmdEntered) =upper('SomeText') ] every time I read a new line
from a new log file to verify if the entry has been made.  So my thought
leans towards FTS, but, I've never written anything dealing with that.

Is there any kind of special preparation I need to do to the database to
get it working effectively?  Is there a particular way I have to make my
queries to see if previous text exists?  Is there a primer, not on the
theory of how it works in the back end, but, how to generate the SQL call
and deal with what comes out?  Are there any caveats I need to be aware
of?  Do I skip FTS and just roll my own word analyzer?

Since Oct 2016, my logs are sitting at just shy of 700meg of text.  Looking
for what I did on a particular machine even last month would be a pain at
this point.
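On the FTS question above: with FTS5 (assuming the SQLite build includes it, as most modern builds do), the "special preparation" is just a CREATE VIRTUAL TABLE, and queries use MATCH instead of LIKE. A minimal round trip, with hypothetical table and column names:

```python
import sqlite3

db = sqlite3.connect(":memory:")
# An FTS5 virtual table: every column is full-text indexed.
db.execute("CREATE VIRTUAL TABLE log_fts USING fts5(filename, data)")
db.executemany("INSERT INTO log_fts VALUES (?, ?)", [
    ("a.log", "restarted apache on db01"),
    ("b.log", "checked disk space on web02"),
])
# MATCH does the full-text search; ORDER BY rank sorts by relevance.
hits = db.execute(
    "SELECT filename FROM log_fts WHERE log_fts MATCH ? ORDER BY rank",
    ("apache",)
).fetchall()
```

The result rows come back like any other SELECT, so "dealing with what comes out" is no different from a LIKE query; only the index behind it changes.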