Is there a reason why just using the command line tool isn't workable for you? 
Personally, I'm happy when I can just do something quick like that.

Also, you can simplify your command to "sort filename | uniq", or just "sort -u filename".
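For example, a quick sketch (fruits.txt is just a made-up sample file); note that uniq only drops adjacent repeats, which is why the sort comes first, and that awk's '!seen[$0]++' idiom removes duplicates while preserving the original line order:

```shell
# Sample input with repeated lines (hypothetical file name).
printf 'apple\nbanana\napple\ncherry\nbanana\n' > fruits.txt

# Classic pipeline: sort groups identical lines so uniq can drop repeats.
sort fruits.txt | uniq

# Same result in a single command:
sort -u fruits.txt

# Order-preserving alternative: print a line only the first time it is seen.
awk '!seen[$0]++' fruits.txt
```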

-rory

On May 6, 2011, at 9:45 AM, Roman Valls wrote:

> Hey galaxy users,
> 
> That's a fairly good question from one of my colleagues. I've looked
> through the menus (mainly "Text Manipulation" and "Filter and
> Sort"(Select)), googled (on the mailing list archives too), but couldn't
> find an answer: How should I remove duplicates on plain text files
> without resorting to:
> 
> "cat file | sort | uniq" before uploading the file/text.
> 
> or
> 
> Putting a regexp together to replace the duplicate occurrences, as in:
> 
> http://www.regular-expressions.info/duplicatelines.html
> 
> 
> I'm pretty sure I'm missing some really basic stuff here... is this
> basic operation supposed to be done outside Galaxy, perhaps?
> 
> Thanks in advance!
> 
> ___________________________________________________________
> The Galaxy User list should be used for the discussion of
> Galaxy analysis and other features on the public server
> at usegalaxy.org.  Please keep all replies on the list by
> using "reply all" in your mail client.  For discussion of
> local Galaxy instances and the Galaxy source code, please
> use the Galaxy Development list:
> 
>  http://lists.bx.psu.edu/listinfo/galaxy-dev
> 
> To manage your subscriptions to this and other Galaxy lists,
> please use the interface at:
> 
>  http://lists.bx.psu.edu/

