Re: Tls toolkit the

2016-09-21 Thread Bryan Rosander
Hi Thomas,

Currently the tls-toolkit can be used in either standalone or client/server
mode.  Standalone has a couple of ease-of-use advantages such as being able
to be run from a single location to generate keystores and truststores as
well as nifi.properties files for the whole NiFi cluster.  If it meets your
requirements, it's probably the easier choice at this point.

Client/server is more appropriate when you're provisioning a cluster that
you don't know the size of in advance and would like to be able to have
nodes request their own certificates on-demand using a shared secret for
authentication.  This use case requires more custom tooling (to integrate
the config.json into nifi.properties, locate the CA server, etc.) and will
hopefully be easier in future releases.
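As a rough sketch of that mode (flag names are from the toolkit's usage output; the token, hostnames, and paths below are placeholders, not values from this thread):

```shell
# On the CA host: run the certificate authority with a shared secret
tls-toolkit.sh server -t mySharedSecret -f /opt/nifi-ca/config.json

# On each new node: request a signed certificate from the CA on demand
tls-toolkit.sh client -t mySharedSecret -c ca-host.example.com -f ./config.json
```

The client writes its keystore/truststore details into config.json, which is the file you would then integrate into nifi.properties with your own tooling.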

You can find some documentation in the admin guide (
https://nifi.apache.org/docs/nifi-docs/html/administration-guide.html#tls-generation-toolkit),
developer level documentation in the developer guide (
https://nifi.apache.org/docs/nifi-docs/html/developer-guide.html#tls-toolkit)
and there is usage information in the toolkit itself.


tls-toolkit.sh -h

tls-toolkit.sh standalone -h

tls-toolkit.sh server -h

tls-toolkit.sh client -h
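For example (a sketch only — the hostnames, client DN, and output directory are placeholders), a single standalone run for a three-node cluster plus an admin client certificate might look like:

```shell
tls-toolkit.sh standalone \
  -n 'nifi[1-3].example.com' \
  -C 'CN=admin,OU=NiFi' \
  -o ./target/tls
```

This should leave one directory per hostname, each containing a keystore, truststore, and pre-populated nifi.properties, plus a client certificate for the -C DN.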


Thanks,

Bryan

On Sep 21, 2016 10:57 PM, "Tijo Thomas"  wrote:

> Hi
>
> Can anyone guide me on how to use the TLS toolkit? In the code I found that
> there are 3 services: standalone, server, and client.
> I created a cluster with the standalone service by following a blog,
> but I am not sure where client and server will be used.
>
> Any doc on this is highly appreciated
>
>
>
> Sent from Yahoo Mail on Android
> 
>


Tls toolkit the

2016-09-21 Thread Tijo Thomas
Hi 
Can anyone guide me on how to use the TLS toolkit? In the code I found that
there are 3 services: standalone, server, and client. I created a cluster with
the standalone service by following a blog, but I am not sure where client and
server will be used.
Any doc on this is highly appreciated 


Sent from Yahoo Mail on Android

Re: UI: flow status and counters feedback

2016-09-21 Thread Andrew Grande
Alright guys, do we have enough consensus to start filing jira work items?
:)

Andrew

On Tue, Sep 20, 2016, 2:01 PM Andrew Grande  wrote:

> Let's fade the connection slowly to an inverted color if backpressure engages?
>
> On Tue, Sep 20, 2016, 1:17 PM Rob Moran  wrote:
>
>> Agreed – thanks for calling that out, Andy.
>>
>> Rob
>>
>> On Tue, Sep 20, 2016 at 1:13 PM, Andy LoPresto 
>> wrote:
>>
>>> In this and other UI discussions going on, I would request that everyone
>>> keep in mind the usability of the software by people with visual and other
>>> impairments. The US Federal Government has guidelines referred to as
>>> “Section 508” [1] which cover the design and usability of software
>>> specifically to ensure access for as many people as possible. Now, NiFi is
>>> not explicitly governed by these rules, but it seems to me that we should
>>> work towards accessibility from the beginning, not as a bolt-on effort.
>>>
>>> In that vein, one of the simplest and easiest rules is “color is great
>>> as a secondary indicator, but should not be the *only* indicator”. In
>>> practice — changing the color of a connection to indicate back pressure is
>>> a great feature, but there should be another indicator of back pressure
>>> that does not require the ability to discern color.
>>>
>>> [1]
>>> https://www.section508.gov/content/learn/standards/quick-reference-guide
>>>
>>> Andy LoPresto
>>> alopre...@apache.org
>>> *alopresto.apa...@gmail.com *
>>> PGP Fingerprint: 70EC B3E5 98A6 5A3F D3C4  BACE 3C6E F65B 2F7D EF69
>>>
>>> On Sep 20, 2016, at 8:28 AM, Andrew Grande  wrote:
>>>
>>> I like the tooltip addition of yours.
>>>
>>> For more interactive feedback on the canvas I can immediately think of 2
>>> items.
>>>
>>> 1. Indicator for when backpressure was configured on a connection
>>> (although it's now always added by default, maybe less useful).
>>>
>>> 2. Changing the color of a connection when backpressure has engaged
>>> could go a long way. Can go further, gradient color based on how close the
>>> connection backlog is to triggering the backpressure controls. Immediately
>>> highlights hotspots visually.
>>>
>>> Andrew
>>>
>>> On Tue, Sep 20, 2016, 9:40 AM Rob Moran  wrote:
>>>
 Andrew,

 Thanks for the feedback on the status bar. Separation between each item
 helps, but after your comments I realize how it can fail to feel like a
 single, cohesive group of items. We could probably tighten things up a bit.

 I think another part of this that could help would be to address some
 of the discussion around awareness of stats updating. Being able to call
 more attention (without being too intrusive) when stats change could help
 ease some of the burden of having to routinely scan the status bar to look
 for changes.

 Also related, I would like to see us get a tooltip that is seen when
 you hover anywhere on the status bar. That tooltip would provide more
 descriptive text about what each item means. It would help new users learn
 as well as provide detail and follow-on action when something is alerted.

 Let's see what others think and then I can work on filing a jira to
 capture thoughts.

 Rob

 On Mon, Sep 19, 2016 at 6:22 PM, Andrew Grande 
 wrote:

> Hi All,
>
> I'd like to provide some feedback on the NiFi 1.0 UI now that I had a
> chance to use it for a while, as well as pass along what I heard directly
> from other end users.
>
> Attached is a screenshot of a status bar right above the main flow
> canvas. The biggest difference from the 0.x UI is how much whitespace it
> now has between elements, to the point where it's not possible to quickly
> scan the state at a glance.
>
> Does anyone have other opinions? Can we adjust things slightly so they
> are easier on the eye and have less horizontal friction?
>
> Thanks!
> Andrew
>
>
>

>>>
>>


Re: Generating SQL from JSON

2016-09-21 Thread Karthik Ramakrishnan
Thanks Bryan. My bad. I missed this big time.

Regards,
Karthik

On Wed, Sep 21, 2016 at 2:26 PM, Bryan Bende  wrote:

> Hello,
>
> Have you taken a look at ConvertJSONToSQL?
>
> https://nifi.apache.org/docs/nifi-docs/components/org.
> apache.nifi.processors.standard.ConvertJSONToSQL/index.html
>
> -Bryan
>
>
> On Wed, Sep 21, 2016 at 2:55 PM, Karthik Ramakrishnan <
> karthik.ramakrishna...@gmail.com> wrote:
>
>> Hello -,
>>
>> I wish to know if there are any processors that do this? I have seen most
>> of them and none seem to fit the purpose well, or it seems so to me. I have
>> a JSON and it contains all the data information for the row and I would
>> extract all the attributes and then make a SQL statement out of it and that
>> would be executed by PutSQL. Is this possible or should I go for a custom
>> processor?
>>
>> --
>> Thanks,
>> *Karthik Ramakrishnan*
>> *Contact : +1 (469) 951-8854 <%2B1%20%28469%29%20951-8854>*
>>
>>
>


-- 
Thanks,
*Karthik Ramakrishnan*
*Contact : +1 (469) 951-8854*


Re: Generating SQL from JSON

2016-09-21 Thread Bryan Bende
Hello,

Have you taken a look at ConvertJSONToSQL?

https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi.processors.standard.ConvertJSONToSQL/index.html
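As a rough illustration of what ConvertJSONToSQL does (the table name and values here are assumptions for the sketch, not taken from the thread): given a flat JSON flowfile, it emits a parameterized statement and puts the values into sql.args.N.* attributes for PutSQL to bind and execute.

```shell
# A flat JSON row as flowfile content:
cat <<'EOF' > /tmp/row.json
{"id": 1, "name": "Alice"}
EOF

# With Table Name = "users" and Statement Type = INSERT, ConvertJSONToSQL
# would produce flowfile content roughly like:
#   INSERT INTO users (id, name) VALUES (?, ?)
# plus attributes such as sql.args.1.value=1 and sql.args.2.value=Alice,
# which PutSQL then binds and executes as a prepared statement.
cat /tmp/row.json
```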

-Bryan


On Wed, Sep 21, 2016 at 2:55 PM, Karthik Ramakrishnan <
karthik.ramakrishna...@gmail.com> wrote:

> Hello -,
>
> I wish to know if there are any processors that do this? I have seen most
> of them and none seem to fit the purpose well, or it seems so to me. I have
> a JSON and it contains all the data information for the row and I would
> extract all the attributes and then make a SQL statement out of it and that
> would be executed by PutSQL. Is this possible or should I go for a custom
> processor?
>
> --
> Thanks,
> *Karthik Ramakrishnan*
> *Contact : +1 (469) 951-8854 <%2B1%20%28469%29%20951-8854>*
>
>


Re: Idea needed to get XL from SharePoint using NiFi

2016-09-21 Thread Matt Foley
Ram, info that may be of use:


Apache Camel has endpoints that can be used declaratively (with little or no 
Java programming -- which doesn't mean easy-to-use; they are under-documented! 
:-).

These Camel endpoints can be used for NiFi within the NiFi 
SpringContextProcessor.

And Camel has a CMIS endpoint that theoretically can be used to talk to 
Sharepoint.


I am working on pulling Salesforce data into NiFi using the Apache Camel 
endpoint for Salesforce in a NiFi SpringContextProcessor.  (Salesforce has a 
complex REST API behind an OAuth login).  I'm having some problems, but getting 
there.


For the Camel CMIS endpoint, it isn't clear that it actually works with 
Sharepoint.  There's a history in google of some folks trying to use it, with 
unclear results.  (Note the google results list is obfuscated by the existence 
of a Sharepoint API called "CAML", which is unrelated to Apache Camel!)  A 
couple relevant links are:

  *   https://code.google.com/archive/p/camel-cmis/
  *   
https://brandingsharepointsites.wordpress.com/2011/11/24/using-camel-query-to-get-specific-items-from-sharepoint-list-or-library/
  *   
http://stackoverflow.com/questions/29621858/using-apache-camel-cmis-with-sharepoint-2013
  *   https://issues.apache.org/jira/browse/CAMEL-6124

You'll have to decide if exploring Camel CMIS is worth the time, vs just
jumping in and creating a custom Processor for Sharepoint -- using, perhaps,
the Apache Chemistry library, which is the underlying library used by the Camel
CMIS implementation!
http://dhartford.blogspot.com/2013/01/sharepoint-2013-w-apache-chemistry-cmis.html
may be useful to read whether you use the Apache Chemistry library directly or
try Apache Camel CMIS.

Let us know which approach you pursue.
Regards,
--Matt



From: Aldrin Piri 
Sent: Wednesday, September 21, 2016 7:47 AM
To: users@nifi.apache.org
Subject: Re: Idea needed to get XL from SharePoint using NiFi

Hi Ram,

Accomplishing this task would require some effort beyond those items currently 
bundled with NiFi.

I am not well versed in Sharepoint internals, but it seems like there is an 
available REST API [1] that would allow you to access files and folders [2].  
Depending on the nuances of authentication, InvokeHTTP may be a reasonable 
solution to both query an endpoint and serve as a means for finding additional 
information to get specific files and folders.  Doing some quick searching on 
top of that, it appears there are a few libraries available that may facilitate 
acquiring files from Sharepoint which could be used in a custom processor to 
streamline the process.

Once you are able to access Sharepoint and gather the documents of interest, 
some transform would be needed from the xls(x) format to CSV.  I have a vague 
recollection of someone in the community working with Apache POI [3] to 
facilitate this process but cannot seem to find the reference with some 
searching.  This would also require custom processor development.

The suite of ExecuteProcess, ExecuteStreamCommand, ExecuteScript, 
InvokeScripted processors may provide some additional avenues for exploration 
if you currently have applications that accomplish some of this presently or 
have comfort working with JVM-friendly scripting languages.

The final steps of saving to disk and/or loading a database are supported and 
should cover your needs beyond the data capture and transformation.

[1] https://msdn.microsoft.com/EN-US/library/office/fp142380.aspx
[2] https://msdn.microsoft.com/en-us/library/office/dn292553.aspx
[3] http://poi.apache.org/

On Mon, Sep 19, 2016 at 4:58 PM, Nathamuni, Ramanujam
<rnatham...@tiaa.org> wrote:
Hello:

I need to get the XL (Excel) files on Windows SharePoint 2013 and store them as
CSV files or load them into a database. Please give me some things to try out.

Thanks,
Ram

*
This e-mail may contain confidential or privileged information.
If you are not the intended recipient, please notify the sender immediately and 
then delete it.

TIAA
*



Generating SQL from JSON

2016-09-21 Thread Karthik Ramakrishnan
Hello -,

I wish to know if there are any processors that do this? I have seen most
of them and none seem to fit the purpose well, or it seems so to me. I have
a JSON and it contains all the data information for the row and I would
extract all the attributes and then make a SQL statement out of it and that
would be executed by PutSQL. Is this possible or should I go for a custom
processor?

-- 
Thanks,
*Karthik Ramakrishnan*
*Contact : +1 (469) 951-8854*


Re: Does NiFi support multiple queries

2016-09-21 Thread Karthik Ramakrishnan
Thank you people!! Great help.

On Wed, Sep 21, 2016 at 7:19 AM, Peter Wicks (pwicks) 
wrote:

> Karthik,
>
>
>
> PutSQL will handle both styles, and for multiple tables, without issue.
>
>
>
> Internally it creates a separate SQL Batch for each distinct SQL statement
> in the queue and then executes these batches separately.  Feel free to mix
> as many Inserts/Updates as you wish for as many tables as you wish.
>
>
>
> Thanks,
>
>   Peter
>
>
>
> *From:* Karthik Ramakrishnan [mailto:karthik.ramakrishna...@gmail.com]
> *Sent:* Tuesday, September 20, 2016 9:57 PM
> *To:* users@nifi.apache.org
> *Subject:* Does NiFi support multiple queries
>
>
>
> Hello -
>
>
>
> I was wondering if NiFi can support multiple queries in the same PutSQL
> processor. For example, if an attribute is set to 'update' - will PutSQL
> run the defined update query and next time when it is an 'insert' - it runs
> the insert query. Or should we go ahead and add two separate processors and
> make a decision on the RouteAttributes processor? Any thoughts would be
> welcome!!
>
>
>
> TIA!!
>
>
>
> --
>
> Thanks,
>
> *Karthik Ramakrishnan*
>
> *Data Services Intern*
>
> *Copart Inc.*
>
> *Contact : +1 (469) 951-8854 <%2B1%20%28469%29%20951-8854>*
>
>
>



-- 
Thanks,
*Karthik Ramakrishnan*
*Contact : +1 (469) 951-8854*


Looking for feedback from MapR users using Kerberized clusters

2016-09-21 Thread Andre
Hi there,

I am in the process of documenting the process required to integrate NiFi
and MapR and I would like to confirm if the following process works for you
folks as well:

*Is there any way to connect to a MapR cluster using the HDFS compatible
API?*

   - A: Yes, but it requires compiling NiFi from source. The process is
   rather straightforward:


*If the cluster is running in secure mode (Kerberos)*


   1. install mapr-client

   2. configure mapr-client as per MapR documentation

   3. After configuring the secure client, confirm everything is working by
   using

   `maprlogin kerberos` or `maprlogin password`

   followed by

   `hadoop fs -ls -d`

   4. compile NiFi using the appropriate profile:

   mvn -T2.0C -DskipTests=true -Pmapr -Dhadoop.version=<version>

   where <version> is the hadoop artifact version matching your MapR release.

   5. deploy NiFi as per the documentation,

   6. modify bootstrap.conf so that it contains:

   java.arg.15=-Djava.security.auth.login.config=/opt/mapr/conf/mapr.login.conf
   7. create a core-site.xml with the following content:

   <configuration>
       <property>
           <name>hadoop.security</name>
           <value>hybrid</value>
       </property>
       <property>
           <name>fs.defaultFS</name>
           <value>maprfs:///</value>
       </property>
   </configuration>
   8. configure the PutHDFS processor to point to the core-site.xml above,
   creating a Kerberos keytab as customary

   9. test


Re: Idea needed to get XL from SharePoint using NiFi

2016-09-21 Thread Aldrin Piri
Hi Ram,

Accomplishing this task would require some effort beyond those items
currently bundled with NiFi.

I am not well versed in Sharepoint internals, but it seems like there is an
available REST API [1] that would allow you to access files and folders
[2].  Depending on the nuances of authentication, InvokeHTTP may be a
reasonable solution to both query an endpoint and serve as a means for
finding additional information to get specific files and folders.  Doing
some quick searching on top of that, it appears there are a few libraries
available that may facilitate acquiring files from Sharepoint which could
be used in a custom processor to streamline the process.
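If you try InvokeHTTP first, the shape of such a request (sketched here with curl; the host, site path, file path, and NTLM auth below are assumptions — SharePoint deployments vary between NTLM, Kerberos, and claims auth):

```shell
curl --ntlm -u 'DOMAIN\user' \
  "https://sharepoint.example.com/_api/web/GetFileByServerRelativeUrl('/Shared%20Documents/report.xlsx')/\$value" \
  -o report.xlsx
```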

Once you are able to access Sharepoint and gather the documents of
interest, some transform would be needed from the xls(x) format to CSV.  I
have a vague recollection of someone in the community working with Apache
POI [3] to facilitate this process but cannot seem to find the reference
with some searching.  This would also require custom processor development.

The suite of ExecuteProcess, ExecuteStreamCommand, ExecuteScript,
InvokeScripted processors may provide some additional avenues for
exploration if you currently have applications that accomplish some of this
presently or have comfort working with JVM-friendly scripting languages.

The final steps of saving to disk and/or loading a database are supported
and should cover your needs beyond the data capture and transformation.

[1] https://msdn.microsoft.com/EN-US/library/office/fp142380.aspx
[2] https://msdn.microsoft.com/en-us/library/office/dn292553.aspx
[3] http://poi.apache.org/

On Mon, Sep 19, 2016 at 4:58 PM, Nathamuni, Ramanujam 
wrote:

> Hello:
>
>
>
> I need to get the XL (Excel) files on Windows SharePoint 2013 and store them
> as CSV files or load them into a database. Please give me some things to try
> out.
>
>
>
> Thanks,
>
> Ram
>
>
> *
> This e-mail may contain confidential or privileged information.
> If you are not the intended recipient, please notify the sender
> immediately and then delete it.
>
> TIAA
> *
>


Re: Requesting Obscene FlowFile Batch Sizes

2016-09-21 Thread Joe Witt
It would buy time but either way it becomes a magic value people have
to know about.  This is not unlike the SplitText scenario where we
recommend doing two-phase splits.  The problem is that for the
ProcessSession we hold information about the flowfiles (not their
content) in memory and the provenance events in memory.  When we're
talking hundreds of thousands or more events in a session that adds up
really quick.  Users should not need to know/worry about this sort of
thing.  We need to have a way to prestage these things to the
respective repositories (provenance/flowfile) so this can go back to
where it belongs as a framework concern.  Easier said than done, but a
good goal for us to have.

Peter's use case is a good one to rally around, as the way he wanted
it to work is reasonable and intuitive, and we should try to make that
happen.

Thanks
Joe

On Tue, Sep 20, 2016 at 5:29 PM, Andy LoPresto  wrote:
> Bryan,
>
> That’s a good point. Would running with a larger Java heap and higher swap
> threshold allow Peter to get larger batches out?
>
> Andy LoPresto
> alopre...@apache.org
> alopresto.apa...@gmail.com
> PGP Fingerprint: 70EC B3E5 98A6 5A3F D3C4  BACE 3C6E F65B 2F7D EF69
>
> On Sep 20, 2016, at 1:41 PM, Bryan Bende  wrote:
>
> Peter,
>
> Does 10k happen to be your swap threshold in nifi.properties by any chance
> (it defaults to 20k I believe)?
>
> I suspect the behavior you are seeing could be due to the way swapping
> works, but Mark or others could probably confirm.
>
> I found this thread where Mark explained how swapping works with a
> background thread, and I believe it still works this way:
> http://apache-nifi.1125220.n5.nabble.com/Nifi-amp-Spark-receiver-performance-configuration-td524.html
>
> -Bryan
>
> On Tue, Sep 20, 2016 at 10:22 AM, Peter Wicks (pwicks) 
> wrote:
>>
>> I’m using JSONToSQL, followed by PutSQL.  I’m using Teradata, which
>> supports a special JDBC mode called FastLoad, designed for a minimum of
>> 100,000 rows of data per batch.
>>
>>
>>
>> What I’m finding is that when PutSQL requests a new batch of FlowFiles
>> from the queue, which has over 1 million rows in it, with a batch size of
>> 100, it always returns a maximum of 10k.  How can I get my obscenely
>> sized batch request to return all the FlowFiles I’m asking for?
>>
>>
>>
>> Thanks,
>>
>>   Peter
>
>
>
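For reference, the swap threshold Bryan mentions is set in nifi.properties (20000 is the documented default; verify against your version):

```properties
# nifi.properties
nifi.queue.swap.threshold=20000
```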


RE: Does NiFi support multiple queries

2016-09-21 Thread Peter Wicks (pwicks)
Karthik,

PutSQL will handle both styles, and for multiple tables, without issue.

Internally it creates a separate SQL Batch for each distinct SQL statement in 
the queue and then executes these batches separately.  Feel free to mix as many 
Inserts/Updates as you wish for as many tables as you wish.

Thanks,
  Peter

From: Karthik Ramakrishnan [mailto:karthik.ramakrishna...@gmail.com]
Sent: Tuesday, September 20, 2016 9:57 PM
To: users@nifi.apache.org
Subject: Does NiFi support multiple queries

Hello -

I was wondering if NiFi can support multiple queries in the same PutSQL 
processor. For example, if an attribute is set to 'update' - will PutSQL run 
the defined update query and next time when it is an 'insert' - it runs the 
insert query. Or should we go ahead and add two separate processors and make a 
decision on the RouteAttributes processor? Any thoughts would be welcome!!

TIA!!

--
Thanks,
Karthik Ramakrishnan
Data Services Intern
Copart Inc.
Contact : +1 (469) 951-8854



Re: Does NiFi support multiple queries

2016-09-21 Thread markap14
Karthik,

You can send them both to the same PutSQL processor. It won't have any problem 
with this. 

Thanks
-Mark

Sent from my iPhone

> On Sep 21, 2016, at 12:07 AM, Karthik Ramakrishnan 
>  wrote:
> 
> Hello -
> 
> I was wondering if NiFi can support multiple queries in the same PutSQL 
> processor. For example, if an attribute is set to 'update' - will PutSQL run 
> the defined update query and next time when it is an 'insert' - it runs the 
> insert query. Or should we go ahead and add two separate processors and make 
> a decision on the RouteAttributes processor? Any thoughts would be welcome!!
> 
> TIA!! 
> 
> -- 
> Thanks,
> Karthik Ramakrishnan
> Data Services Intern
> Copart Inc.
> Contact : +1 (469) 951-8854
> 


Re: Regarding ConsumeIMAP Processor.

2016-09-21 Thread Andre
Prabhu,

The sample message you sent seems to be a multipart MIME message, so if
what you are referring to is the part that shows

HI\nHello\n\nWelcome


Then the ExtractEmailAttachments should be able to extract the contents (it
is an attachment), spitting out a flowfile containing the "Sample.txt"
attachment.

You would chain:

ConsumeIMAP (get the message from the server) -> ExtractEmailHeaders (get the
From, Subject, To, etc. from the flowfile content and add them as attributes)
-> ExtractEmailAttachments (extract each individual attachment from the
flowfile).
At this stage you should produce a new flowfile containing Sample.txt which
seems to be the content you want?

Would you also mind uploading the sample email you want to process into a
gist? May be easier to understand what you are trying to achieve.

Cheers


On Wed, Sep 21, 2016 at 6:41 PM, prabhu Mahendran 
wrote:

> Andre,
>
> Thanks for your help.
>
> I have already used those processors, and they produce the plain message
> plus MIME information, without the RFC 2822 headers.
>
> Could you suggest any other processor to remove the MIME information?
>
> Thanks,
>
>
>
> On Wed, Sep 21, 2016 at 10:57 AM, Andre  wrote:
>
>> Prabhu,
>>
>>
>> Would ExtractEmailHeaders[1] and ExtractEmailAttachments[2] cover your
>> use case?
>>
>> https://nifi.apache.org/docs/nifi-docs/components/org.apache
>> .nifi.processors.email.ExtractEmailHeaders/index.html
>> https://nifi.apache.org/docs/nifi-docs/components/org.apache
>> .nifi.processors.email.ExtractEmailAttachments/index.html
>>
>> Cheers
>>
>>
>> On Wed, Sep 21, 2016 at 2:25 PM, prabhu Mahendran <
>> prabhuu161...@gmail.com> wrote:
>>
>>> Hi,
>>>
>>> I am new to NiFi. I have just used the ConsumeIMAP processor to retrieve
>>> an attachment from a mail server.
>>>
>>> If I use it, I am able to download the attachment, but that document has
>>> MIME type information in addition to the email data, as in the screenshot
>>> below.
>>>
>>>
>>> I need to extract the exact data only but this data comes with some MIME
>>> information.
>>>
>>> Can anyone please help me to extract the data only, or remove the MIME
>>> information from the file?
>>>
>>> Thanks,
>>>
>>>
>>>
>>>
>>>
>>>
>>
>


Re: Regarding ConsumeIMAP Processor.

2016-09-21 Thread prabhu Mahendran
Andre,

Thanks for your help.

I have already used those processors, and they produce the plain message plus
MIME information, without the RFC 2822 headers.

Could you suggest any other processor to remove the MIME information?

Thanks,



On Wed, Sep 21, 2016 at 10:57 AM, Andre  wrote:

> Prabhu,
>
>
> Would ExtractEmailHeaders[1] and ExtractEmailAttachments[2] cover your use
> case?
>
> https://nifi.apache.org/docs/nifi-docs/components/org.
> apache.nifi.processors.email.ExtractEmailHeaders/index.html
> https://nifi.apache.org/docs/nifi-docs/components/org.
> apache.nifi.processors.email.ExtractEmailAttachments/index.html
>
> Cheers
>
>
> On Wed, Sep 21, 2016 at 2:25 PM, prabhu Mahendran  > wrote:
>
>> Hi,
>>
>> I am new to NiFi. I have just used the ConsumeIMAP processor to retrieve
>> an attachment from a mail server.
>>
>> If I use it, I am able to download the attachment, but that document has
>> MIME type information in addition to the email data, as in the screenshot
>> below.
>>
>>
>> I need to extract the exact data only but this data comes with some MIME
>> information.
>>
>> Can anyone please help me to extract the data only, or remove the MIME
>> information from the file?
>>
>> Thanks,
>>
>>
>>
>>
>>
>>
>