Script 'mail_helper' called by obssrc
Hello community,

here is the log from the commit of package urlscan for openSUSE:Factory checked 
in at 2022-02-27 22:43:15
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/urlscan (Old)
 and      /work/SRC/openSUSE:Factory/.urlscan.new.1958 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Package is "urlscan"

Sun Feb 27 22:43:15 2022 rev:10 rq:957891 version:0.9.9

Changes:
--------
--- /work/SRC/openSUSE:Factory/urlscan/urlscan.changes  2021-11-02 19:19:53.927117754 +0100
+++ /work/SRC/openSUSE:Factory/.urlscan.new.1958/urlscan.changes        2022-02-27 22:44:23.274653013 +0100
@@ -1,0 +2,14 @@
+Sun Jan 30 15:23:36 UTC 2022 - Avinesh Kumar <avinesh.ku...@suse.com>
+
+- update to version 0.9.9:
+  * Bugfix in f-string
+  * Updated setup.py after tagging previous version
+
+- includes changes from version 0.9.8:
+  * Update TLD list
+  * Fix #118, crash on navigating to non-existent line
+  * Pylint fixes
+  * Move to subprocess.run instead of call and Popen (see the sketch below)
+  * Switch to f-strings
+
+-------------------------------------------------------------------
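
A short illustration of the two 0.9.8 items mentioned above: the move from
subprocess call/Popen to subprocess.run, and the switch to f-strings. This is
an editorial sketch, not code from urlscan; the echo command and the URL are
placeholders. The real conversions are visible in the urlchoose.py diff
further below.

    # Editorial sketch only: the pattern behind "Move to subprocess.run" and
    # "Switch to f-strings". The command and URL are placeholders.
    import shlex
    import subprocess

    url = "https://example.com"      # placeholder URL
    cmd = f'echo "{url}"'            # f-string instead of 'echo "{}"'.format(url)

    # Pre-0.9.8 style:
    #   process = Popen(shlex.split(cmd), stdout=PIPE, stdin=PIPE)
    #   process.communicate()
    # 0.9.8 style: a single subprocess.run() call with explicit check= and
    # stream handling, no manual pipe management.
    subprocess.run(shlex.split(cmd), check=False, stdout=subprocess.DEVNULL)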

Old:
----
  urlscan-0.9.7.tar.gz

New:
----
  urlscan-0.9.9.tar.gz

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Other differences:
------------------
++++++ urlscan.spec ++++++
--- /var/tmp/diff_new_pack.V2qSW5/_old  2022-02-27 22:44:23.674653129 +0100
+++ /var/tmp/diff_new_pack.V2qSW5/_new  2022-02-27 22:44:23.682653132 +0100
@@ -1,7 +1,7 @@
 #
 # spec file for package urlscan
 #
-# Copyright (c) 2021 SUSE LLC
+# Copyright (c) 2022 SUSE LLC
 #
 # All modifications and additions to the file contributed by third parties
 # remain the property of their copyright owners, unless otherwise agreed
@@ -18,7 +18,7 @@
 
 %define python_flavor python3
 Name:           urlscan
-Version:        0.9.7
+Version:        0.9.9
 Release:        0
 Summary:        An other URL extractor/viewer
 License:        GPL-2.0-or-later

++++++ urlscan-0.9.7.tar.gz -> urlscan-0.9.9.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/urlscan-0.9.7/README.md new/urlscan-0.9.9/README.md
--- old/urlscan-0.9.7/README.md 2021-10-06 20:10:59.000000000 +0200
+++ new/urlscan-0.9.9/README.md 2022-01-29 05:38:35.000000000 +0100
@@ -14,9 +14,7 @@
 mailreader to allow you to easily launch a Web browser for URLs contained in
 email messages. It is a replacement for the "urlview" program.
 
-*NOTE* The last version that is Python 2 compatible is 0.9.3.
-
-Requires: Python 3.6+ and the python-urwid library
+Requires: Python 3.7+ and the python-urwid library
 
 ## Features
 
@@ -67,6 +65,8 @@
 - Scan certain email headers for URLs. Currently `Link`, `Archived-At` and
   `List-*` are scanned when `--headers` is passed.
 
+- Queue multiple URLs for opening and open them all at once with `a` and `o`.
+
 ## Installation and setup
 
 To install urlscan, install from your distribution repositories (Archlinux),
@@ -84,10 +84,10 @@
 
 Once urlscan is installed, add the following lines to your .muttrc:
 
-    macro index,pager \\cb "<pipe-message> urlscan<Enter>" "call urlscan to
+    macro index,pager \cb "<pipe-message> urlscan<Enter>" "call urlscan to
     extract URLs out of a message"
 
-    macro attach,compose \\cb "<pipe-entry> urlscan<Enter>" "call urlscan to
+    macro attach,compose \cb "<pipe-entry> urlscan<Enter>" "call urlscan to
     extract URLs out of a message"
 
 Once this is done, Control-b while reading mail in mutt will automatically
@@ -152,6 +152,7 @@
 
 The follow actions are supported:
 
+- `add_url` -- add a URL to the queue (default: `a`)
 - `all_escape` -- toggle unescape all URLs (default: `u`)
 - `all_shorten` -- toggle shorten all URLs (default: `S`)
 - `bottom` -- move cursor to last item (default: `G`)
@@ -159,9 +160,12 @@
 - `clipboard` -- copy highlighted URL to clipboard using xsel/xclip (default: `C`)
 - `clipboard_pri` -- copy highlighted URL to primary selection using xsel/xclip (default: `P`)
 - `context` -- show/hide context (default: `c`)
+- `del_url` -- delete URL from the queue (default: `d`)
 - `down` -- cursor down (default: `j`)
 - `help_menu` -- show/hide help menu (default: `F1`)
 - `link_handler` -- cycle link handling (webbrowser, xdg-open, --run-safe or --run) (default: `l`)
+- `open_queue` -- open all URLs in queue (default: `o`)
+- `open_queue_win` -- open all URLs in queue in new window (default: `O`)
 - `open_url` -- open selected URL (default: `space` or `enter`)
 - `palette` -- cycle through palettes (default: `p`)
 - `quit` -- quit (default: `q` or `Q`)
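
A note on the new queue actions documented in the README hunk above (`add_url`,
`del_url`, `open_queue`, `open_queue_win`): like the existing actions, they are
resolved by name from the "keys" section of ~/.config/urlscan/config.json, as
the config-loading code in the urlchoose.py diff further below shows. The
following is a hypothetical sketch, not part of the package, of remapping them;
the replacement key names are arbitrary, and the config file is assumed to
exist already (it can be generated with the config_create action).

    # Hypothetical sketch (not shipped with urlscan): rebind the queue actions
    # by editing ~/.config/urlscan/config.json, whose layout
    # {"palettes": {...}, "keys": {"<key>": "<action>"}} matches the
    # json.dumps() call visible in the urlchoose.py diff below.
    import json
    import os

    conf = os.path.expanduser("~/.config/urlscan/config.json")
    with open(conf, encoding="utf-8") as fobj:
        data = json.load(fobj)

    # Arbitrary illustrative bindings; any urwid key name should work here.
    data["keys"].update({"A": "add_url",
                         "D": "del_url",
                         "ctrl o": "open_queue"})

    with open(conf, "w", encoding="utf-8") as fobj:
        json.dump(data, fobj, indent=4)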
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/urlscan-0.9.7/setup.py new/urlscan-0.9.9/setup.py
--- old/urlscan-0.9.7/setup.py  2021-10-06 20:10:59.000000000 +0200
+++ new/urlscan-0.9.9/setup.py  2022-01-29 05:38:35.000000000 +0100
@@ -6,19 +6,19 @@
 
 def long_description():
     """Generate long description from README"""
-    with open("README.md") as readme:
+    with open("README.md", encoding='utf-8') as readme:
         return readme.read()
 
 
 setup(name="urlscan",
-      version="0.9.7",
+      version="0.9.9",
       description="View/select the URLs in an email message or file",
       long_description=long_description(),
       long_description_content_type="text/markdown",
       author="Scott Hansen",
       author_email="firecat4...@gmail.com",
       url="https://github.com/firecat53/urlscan";,
-      download_url="https://github.com/firecat53/urlscan/archive/0.9.6.zip";,
+      download_url="https://github.com/firecat53/urlscan/archive/0.9.9.zip";,
       packages=['urlscan'],
       entry_points={
           'console_scripts': ['urlscan=urlscan.__main__:main']
@@ -35,10 +35,10 @@
           'License :: OSI Approved :: GNU General Public License v2 (GPLv2)',
           'Operating System :: OS Independent',
           'Programming Language :: Python',
-          'Programming Language :: Python :: 3.6',
           'Programming Language :: Python :: 3.7',
           'Programming Language :: Python :: 3.8',
           'Programming Language :: Python :: 3.9',
+          'Programming Language :: Python :: 3.10',
           'Topic :: Utilities'],
       keywords="urlscan, urlview, email, mutt, tmux"
       )
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/urlscan-0.9.7/urlscan/assets/tlds-alpha-by-domain.txt new/urlscan-0.9.9/urlscan/assets/tlds-alpha-by-domain.txt
--- old/urlscan-0.9.7/urlscan/assets/tlds-alpha-by-domain.txt   2021-10-06 20:10:59.000000000 +0200
+++ new/urlscan-0.9.9/urlscan/assets/tlds-alpha-by-domain.txt   2022-01-29 05:38:35.000000000 +0100
@@ -1,4 +1,4 @@
-# Version 2020051401, Last Updated Fri May 15 07:07:01 2020 UTC
+# Version 2021100600, Last Updated Wed Oct  6 07:07:01 2021 UTC
 AAA
 AARP
 ABARTH
@@ -33,7 +33,6 @@
 AGENCY
 AI
 AIG
-AIGO
 AIRBUS
 AIRFORCE
 AIRTEL
@@ -48,6 +47,7 @@
 ALSACE
 ALSTOM
 AM
+AMAZON
 AMERICANEXPRESS
 AMERICANFAMILY
 AMEX
@@ -212,7 +212,6 @@
 CARS
 CASA
 CASE
-CASEIH
 CASH
 CASINO
 CAT
@@ -224,7 +223,6 @@
 CBS
 CC
 CD
-CEB
 CENTER
 CEO
 CERN
@@ -392,7 +390,6 @@
 ES
 ESQ
 ESTATE
-ESURANCE
 ET
 ETISALAT
 EU
@@ -463,7 +460,6 @@
 FRONTIER
 FTR
 FUJITSU
-FUJIXEROX
 FUN
 FUND
 FURNITURE
@@ -614,7 +610,6 @@
 INSURANCE
 INSURE
 INT
-INTEL
 INTERNATIONAL
 INTUIT
 INVESTMENTS
@@ -630,11 +625,9 @@
 IT
 ITAU
 ITV
-IVECO
 JAGUAR
 JAVA
 JCB
-JCP
 JE
 JEEP
 JETZT
@@ -750,7 +743,6 @@
 LTDA
 LU
 LUNDBECK
-LUPIN
 LUXE
 LUXURY
 LV
@@ -786,7 +778,6 @@
 MEN
 MENU
 MERCKMSD
-METLIFE
 MG
 MH
 MIAMI
@@ -840,7 +831,6 @@
 NAB
 NAGOYA
 NAME
-NATIONWIDE
 NATURA
 NAVY
 NBA
@@ -853,7 +843,6 @@
 NETWORK
 NEUSTAR
 NEW
-NEWHOLLAND
 NEWS
 NEXT
 NEXTDIRECT
@@ -901,7 +890,6 @@
 ONG
 ONL
 ONLINE
-ONYOURSIDE
 OOO
 OPEN
 ORACLE
@@ -1020,11 +1008,9 @@
 RICH
 RICHARDLI
 RICOH
-RIGHTATHOME
 RIL
 RIO
 RIP
-RMIT
 RO
 ROCHER
 ROCKS
@@ -1071,7 +1057,6 @@
 SCHWARZ
 SCIENCE
 SCJOHNSON
-SCOR
 SCOT
 SD
 SE
@@ -1103,7 +1088,6 @@
 SHOUJI
 SHOW
 SHOWTIME
-SHRIRAM
 SI
 SILK
 SINA
@@ -1133,10 +1117,10 @@
 SONG
 SONY
 SOY
+SPA
 SPACE
 SPORT
 SPOT
-SPREADBETTING
 SR
 SRL
 SS
@@ -1165,12 +1149,10 @@
 SUZUKI
 SV
 SWATCH
-SWIFTCOVER
 SWISS
 SX
 SY
 SYDNEY
-SYMANTEC
 SYSTEMS
 SZ
 TAB
@@ -1351,6 +1333,7 @@
 XN--45BR5CYL
 XN--45BRJ9C
 XN--45Q11C
+XN--4DBRK0CE
 XN--4GBRIM
 XN--54B7FTA0CC
 XN--55QW42G
@@ -1376,6 +1359,7 @@
 XN--C1AVG
 XN--C2BR7G
 XN--CCK2B3B
+XN--CCKWCXETD
 XN--CG4BKI
 XN--CLCHC0EA0B2G2A9GCD
 XN--CZR694B
@@ -1411,12 +1395,12 @@
 XN--J1AEF
 XN--J1AMH
 XN--J6W193G
+XN--JLQ480N2RG
 XN--JLQ61U9W7B
 XN--JVR189M
 XN--KCRX77D1X4A
 XN--KPRW13D
 XN--KPRY57D
-XN--KPU716F
 XN--KPUT3I
 XN--L1ACC
 XN--LGBBAT1AD8J
@@ -1457,7 +1441,6 @@
 XN--OTU796D
 XN--P1ACF
 XN--P1AI
-XN--PBT977C
 XN--PGBS0DH
 XN--PSSY2U
 XN--Q7CE6A
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/urlscan-0.9.7/urlscan/urlchoose.py new/urlscan-0.9.9/urlscan/urlchoose.py
--- old/urlscan-0.9.7/urlscan/urlchoose.py      2021-10-06 20:10:59.000000000 +0200
+++ new/urlscan-0.9.9/urlscan/urlchoose.py      2022-01-29 05:38:35.000000000 +0100
@@ -23,8 +23,9 @@
 from os.path import dirname, exists, expanduser
 import re
 import shlex
-from subprocess import call, Popen, PIPE, DEVNULL
+import subprocess
 import sys
+from threading import Thread
 import webbrowser
 
 import urwid
@@ -81,7 +82,7 @@
 
     """
     if search:
-        pat = re.compile("({})".format(re.escape(search)), re.IGNORECASE)
+        pat = re.compile(f"({re.escape(search)})", re.IGNORECASE)
     else:
         return text
     final = pat.split(text)
@@ -106,9 +107,11 @@
                      '7': self._digits,
                      '8': self._digits,
                      '9': self._digits,
+                     'a': self._add_url,
                      'C': self._clipboard,
                      'c': self._context,
                      'ctrl l': self._clear_screen,
+                     'd': self._del_url,
                      'f1': self._help_menu,
                      'G': self._bottom,
                      'g': self._top,
@@ -116,6 +119,8 @@
                      'k': self._up,
                      'P': self._clipboard_pri,
                      'l': self._link_handler,
+                     'o': self._open_queue,
+                     'O': self._open_queue_win,
                      'p': self._palette,
                      'Q': self._quit,
                      'q': self._quit,
@@ -146,20 +151,20 @@
                ('urlref:url', 'white', 'black', 'standout'),
                ('url:sel', 'black', 'light gray', 'bold')]
         # Boruch's colorized palette
-        colorized =[('header','brown','black','standout'),
-                    ('footer','white','dark red','standout'),
-                    ('search','white','dark green','standout'),
-                    ('msgtext','light cyan','black'),
-                    ('msgtext:ellipses','light gray','black'),
-                    ('urlref:number:braces','light gray','black'),
-                    ('urlref:number','yellow','black','standout'),
-                    ('urlref:url','dark green','black','standout'),
-                    ('url:sel','white','black','')]
+        colorized = [('header', 'brown', 'black', 'standout'),
+                     ('footer', 'white', 'dark red', 'standout'),
+                     ('search', 'white', 'dark green', 'standout'),
+                     ('msgtext', 'light cyan', 'black'),
+                     ('msgtext:ellipses', 'light gray', 'black'),
+                     ('urlref:number:braces', 'light gray', 'black'),
+                     ('urlref:number', 'yellow', 'black', 'standout'),
+                     ('urlref:url', 'dark green', 'black', 'standout'),
+                     ('url:sel', 'white', 'black', '')]
         self.palettes.update([("default", default), ("bw", blw), ("colorized", colorized)])
         if genconf is True:
             self._config_create()
         try:
-            with open(self.conf, 'r') as conf_file:
+            with open(self.conf, 'r', encoding=sys.getdefaultencoding()) as conf_file:
                 data = json.load(conf_file)
                 try:
                     for pal_name, pal in data['palettes'].items():
@@ -172,7 +177,7 @@
                         if value:
                             if value == "open_url":
                                 urwid.Button._command_map._command[key] = 'activate'
-                            value = getattr(self, "_{}".format(value))
+                            value = getattr(self, f"_{value}")
                         else:
                             del self.keys[key]
                             continue
@@ -182,12 +187,13 @@
         except FileNotFoundError:
             pass
         try:
-            call(['xdg-open'], stdout=DEVNULL)
+            subprocess.run(['xdg-open'], check=False, stdout=subprocess.DEVNULL)
             self.xdg = True
         except OSError:
             self.xdg = False
         self.shorten = shorten
         self.compact = compact
+        self.queue = []
         self.run = run
         self.runsafe = runsafe
         self.single = single
@@ -219,7 +225,8 @@
         self.header = (":: F1 - help/keybindings :: "
                        "q - quit :: "
                        "/ - search :: "
-                       "URL opening mode - {}")
+                       "URL opening mode - {} :: "
+                       "Queue - {}")
         self.link_open_modes = ["Web Browser", "Xdg-Open"] if self.xdg is True else ["Web Browser"]
         if self.runsafe:
             self.link_open_modes.insert(0, self.runsafe)
@@ -228,7 +235,7 @@
         self.nohelp = nohelp
         if nohelp is False:
             self.headerwid = urwid.AttrMap(urwid.Text(
-                self.header.format(self.link_open_modes[0])), 'header')
+                self.header.format(self.link_open_modes[0], len(self.queue))), 'header')
         else:
             self.headerwid = None
         self.top = urwid.Frame(listbox, self.headerwid)
@@ -270,7 +277,7 @@
         """
         for j, k in enumerate(keys):
             if self.search is True:
-                text = "Search: {}".format(self.search_string)
+                text = f"Search: {self.search_string}"
                 if k == 'enter':
                     # Catch 'enter' key to prevent opening URL in mkbrowseto
                     self.enter = True
@@ -339,19 +346,84 @@
     def _open_url(self):
         """<Enter> or <space>"""
         load_text = "Loading URL..." if self.link_open_modes[0] != (self.run 
or self.runsafe) \
-            else "Executing: {}".format(self.run or self.runsafe)
+            else f"Executing: {self.run or self.runsafe}"
         if os.environ.get('BROWSER') not in ['elinks', 'links', 'w3m', 'lynx']:
             self._footer_display(load_text, 5)
 
+    def _background_queue(self, mode):
+        """Open URLs in background"""
+        for url in self.queue:
+            self.mkbrowseto(url, thread=True, mode=mode)()
+        self.draw_screen()
+
+    def _queue(self, mode=2):
+        """Open all URLs in queue
+
+            Args: mode - 2 for new tab, 1 for new window
+
+        """
+        load_text = "Loading URLs in queue..." \
+            if self.link_open_modes[0] != (self.run or self.runsafe) \
+            else f"Executing: {self.run or self.runsafe}"
+        if os.environ.get('BROWSER') in ['elinks', 'links', 'w3m', 'lynx']:
+            self._footer_display("Opening multiple links not support in text browsers", 5)
+        else:
+            self._footer_display(load_text, 5)
+        thr = Thread(target=self._background_queue, args=(mode,))
+        thr.start()
+        self.queue = []
+        self.headerwid = urwid.AttrMap(urwid.Text(
+            self.header.format(self.link_open_modes[0], len(self.queue))), 'header')
+        self.top.base_widget.header = self.headerwid
+
+    def _open_queue(self):
+        """o (new tab)"""
+        if self.queue:
+            self._queue()
+
+    def _open_queue_win(self):
+        """O (new window)"""
+        if self.queue:
+            self._queue(1)
+
+    def _add_url(self):
+        """a"""
+        fpo = self.top.base_widget.body.focus_position
+        url_idx = len([i for i in self.items[:fpo + 1]
+                       if isinstance(i, urwid.Columns)]) - 1
+        if self.compact is False and fpo <= 1:
+            return
+        self.queue.append(self.urls[url_idx])
+        self.queue = list(set(self.queue))
+        self.headerwid = urwid.AttrMap(urwid.Text(
+            self.header.format(self.link_open_modes[0], len(self.queue))), 'header')
+        self.top.base_widget.header = self.headerwid
+
+    def _del_url(self):
+        """d"""
+        fpo = self.top.base_widget.body.focus_position
+        url_idx = len([i for i in self.items[:fpo + 1]
+                       if isinstance(i, urwid.Columns)]) - 1
+        if self.compact is False and fpo <= 1:
+            return
+        try:
+            self.queue.remove(self.urls[url_idx])
+            self.headerwid = urwid.AttrMap(urwid.Text(
+                self.header.format(self.link_open_modes[0], len(self.queue))), 'header')
+            self.top.base_widget.header = self.headerwid
+        except ValueError:
+            pass
+
     def _help_menu(self):
         """F1"""
         if self.help_menu is False:
             self.focus_pos_saved = self.top.base_widget.body.focus_position
-            help_men = "\n".join(["{} - {}".format(i, j.__name__.strip('_'))
+            help_men = "\n".join([f"{i} - {j.__name__.strip('_')}"
                                   for i, j in self.keys.items() if j.__name__ !=
                                   '_digits'])
             help_men = "KEYBINDINGS\n" + help_men + "\n<0-9> - Jump to item"
             docs = ("OPTIONS\n"
+                    "add_url       -- add URL to queue\n"
                     "all_escape    -- toggle unescape all URLs\n"
                     "all_shorten   -- toggle shorten all URLs\n"
                     "bottom        -- move cursor to last item\n"
@@ -362,10 +434,13 @@
                     "                 selection using xsel/xclip\n"
                     "config_create -- create ~/.config/urlscan/config.json\n"
                     "context       -- show/hide context\n"
+                    "del_url       -- delete URL from queue\n"
                     "down          -- cursor down\n"
                     "help_menu     -- show/hide help menu\n"
                     "link_handler  -- cycle through xdg-open, webbrowser \n"
                     "                 and user-defined function\n"
+                    "open_queue    -- open all URLs in queue\n"
+                    "open_queue_win-- open all URLs in queue in new window\n"
                     "open_url      -- open selected URL\n"
                     "palette       -- cycle through palettes\n"
                     "quit          -- quit\n"
@@ -413,9 +488,11 @@
                     self.items.index(self.items[max(int(self.number) - 1, 0)])
         except IndexError:
             self.number = self.number[:-1]
+        except ValueError:
+            pass
         self.top.base_widget.keypress(self.size, "")  # Trick urwid into redisplaying the cursor
         if self.number:
-            self._footer_display("Selection: {}".format(self.number), 1)
+            self._footer_display(f"Selection: {self.number}", 1)
 
     def _clear_screen(self):
         """ Ctrl-l """
@@ -524,10 +601,13 @@
         cmds = COPY_COMMANDS_PRIMARY if pri else COPY_COMMANDS
         for cmd in cmds:
             try:
-                proc = Popen(shlex.split(cmd), stdin=PIPE, stdout=DEVNULL, stderr=DEVNULL)
-                proc.communicate(input=url.encode(sys.getdefaultencoding()))
-                self._footer_display("Copied url to {} selection".format(
-                    "primary" if pri is True else "clipboard"), 5)
+                subprocess.run(shlex.split(cmd),
+                               check=False,
+                               input=url.encode(sys.getdefaultencoding()),
+                               stdout=subprocess.DEVNULL,
+                               stderr=subprocess.DEVNULL)
+                self._footer_display("Copied url to "
+                                     f"{'primary' if pri is True else 'clipboard'} selection", 5)
             except OSError:
                 continue
             if self.single is True:
@@ -557,7 +637,7 @@
             os.makedirs(dirname(expanduser(self.conf)), exist_ok=True)
             keys = dict(zip(self.keys.keys(),
                             [i.__name__.strip('_') for i in self.keys.values()]))
-            with open(expanduser(self.conf), 'w') as pals:
+            with open(expanduser(self.conf), 'w', encoding=sys.getdefaultencoding()) as pals:
                 pals.writelines(json.dumps({"palettes": self.palettes, "keys": keys},
                                            indent=4))
             print("Created ~/.config/urlscan/config.json")
@@ -579,7 +659,7 @@
 
         """
         self.number = ""  # Clear URL selection number
-        text = "Search: {}".format(self.search_string)
+        text = f"Search: {self.search_string}"
         if self.search_string:
             footer = 'search'
         else:
@@ -604,7 +684,7 @@
         """ Search - search URLs and text.
 
         """
-        text = "Search: {}".format(self.search_string)
+        text = f"Search: {self.search_string}"
         footerwid = urwid.AttrMap(urwid.Text(text), 'footer')
         self.top.base_widget.footer = footerwid
         search_items = []
@@ -661,15 +741,19 @@
         self.link_open_modes.insert(0, mode)
         if self.nohelp is False:
             self.headerwid = urwid.AttrMap(urwid.Text(
-                self.header.format(self.link_open_modes[0])), 'header')
+                self.header.format(self.link_open_modes[0], len(self.queue))), 'header')
             self.top.base_widget.header = self.headerwid
 
-    def mkbrowseto(self, url):
+    def mkbrowseto(self, url, thread=False, mode=0):
         """Create the urwid callback function to open the web browser or call
         another function with the URL.
 
         """
-        def browse(*args):
+        def browse(*args):  # pylint: disable=unused-argument
+            # These 3 lines prevent any stderr messages from webbrowser or xdg
+            savout = os.dup(2)
+            os.close(2)
+            os.open(os.devnull, os.O_RDWR)
             # double ()() to ensure self.search evaluated at runtime, not when
             # browse() is _created_. [0] is self.search, [1] is self.enter
             # self.enter prevents opening URL when in search mode
@@ -678,26 +762,30 @@
                     self.search = False
                     self.enter = False
             elif self.link_open_modes[0] == "Web Browser":
-                webbrowser.open(url)
+                webbrowser.open(url, new=mode)
             elif self.link_open_modes[0] == "Xdg-Open":
-                run = 'xdg-open "{}"'.format(url)
-                process = Popen(shlex.split(run), stdout=PIPE, stdin=PIPE)
+                subprocess.run(shlex.split(f'xdg-open "{url}"'), check=False)
             elif self.link_open_modes[0] == self.runsafe:
                 if self.pipe:
-                    process = Popen(shlex.split(self.runsafe), stdout=PIPE, stdin=PIPE)
-                    process.communicate(input=url.encode(sys.getdefaultencoding()))
+                    subprocess.run(shlex.split(self.runsafe),
+                                   check=False,
+                                   input=url.encode(sys.getdefaultencoding()))
                 else:
                     cmd = [i.format(url) for i in shlex.split(self.runsafe)]
-                    Popen(cmd).communicate()
+                    subprocess.run(cmd, check=False)
             elif self.link_open_modes[0] == self.run and self.pipe:
-                process = Popen(shlex.split(self.run), stdout=PIPE, stdin=PIPE)
-                process.communicate(input=url.encode(sys.getdefaultencoding()))
+                subprocess.run(shlex.split(self.run),
+                               check=False,
+                               input=url.encode(sys.getdefaultencoding()))
             else:
-                Popen(self.run.format(url), shell=True).communicate()
+                subprocess.run(self.run.format(url), check=False, shell=True)
 
             if self.single is True:
                 self._quit()
-            self.draw_screen()
+            # Restore normal stderr
+            os.dup2(savout, 2)
+            if thread is False:
+                self.draw_screen()
         return browse
 
     def process_urls(self, extractedurls, dedupe, shorten):
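
The bulk of the urlchoose.py diff above is the new URL queue: _add_url and
_del_url maintain self.queue, and _open_queue/_open_queue_win hand the queue
to _background_queue on a threading.Thread so the urwid interface is not
blocked while browsers start. The standalone sketch below is an editorial
condensation, not code from the package; the example URLs are placeholders,
and error handling plus the text-browser special case are left out.

    # Condensed, hypothetical illustration of the queue-and-open pattern added
    # in 0.9.8/0.9.9. As in _queue() above, new=2 asks webbrowser for a new
    # tab and new=1 for a new window.
    import webbrowser
    from threading import Thread

    queue = []

    def add_url(url):
        """Queue a URL once; the diff dedupes with list(set(...))."""
        if url not in queue:
            queue.append(url)

    def open_queue(mode=2):
        """Open every queued URL from a background thread, then clear it."""
        urls = list(queue)
        queue.clear()
        worker = Thread(target=lambda: [webbrowser.open(u, new=mode)
                                        for u in urls])
        worker.start()

    add_url("https://example.com")
    add_url("https://example.org")
    open_queue()   # both URLs open in new tabs without blocking the caller
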
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/urlscan-0.9.7/urlscan/urlscan.py new/urlscan-0.9.9/urlscan/urlscan.py
--- old/urlscan-0.9.7/urlscan/urlscan.py        2021-10-06 20:10:59.000000000 +0200
+++ new/urlscan-0.9.9/urlscan/urlscan.py        2022-01-29 05:38:35.000000000 +0100
@@ -22,6 +22,7 @@
 import locale
 import os
 import re
+from sys import getdefaultencoding
 
 
 class Chunk:
@@ -40,8 +41,7 @@
         self.url = url
 
     def __str__(self):
-        return 'Chunk(markup = %s, url= %s)' % (repr(self.markup),
-                                                repr(self.url))
+        return f'Chunk(markup = {repr(self.markup)}, url= {repr(self.url)})'
 
     def __repr__(self):
         return self.__str__()
@@ -115,12 +115,11 @@
             if tag == 'ul':
                 depth = len([t for t in self.list_stack if t[0] == tag])
                 ul_tags = HTMLChunker.ul_tags
-                chunk = Chunk('%s  ' % (ul_tags[depth % len(ul_tags)]),
-                              self.cur_url())
+                chunk = Chunk(f"{ul_tags[depth % len(ul_tags)]}  ", self.cur_url())
             else:
                 counter = self.list_stack[-1][1]
                 self.list_stack[-1] = (tag, counter + 1)
-                chunk = Chunk("%2d." % counter, self.cur_url())
+                chunk = Chunk(f"{counter:2d}.", self.cur_url())
             self.add_chunk(chunk)
         else:
             self.end_para()
@@ -226,7 +225,7 @@
         elif char in HTMLChunker.extrachars:
             name = HTMLChunker.extrachars[char]
         else:
-            name = '&#%s;' % name
+            name = f"&#{name};"
         self.handle_data(name)
 
     entities = {'nbsp': ' ',
@@ -243,7 +242,7 @@
         else:
             # If you see a reference, it needs to be
             # added above.
-            self.handle_data('&%s;' % name)
+            self.handle_data(f"&{name};")
 
 
 URLINTERNALPATTERN = r'[{}()@\w/\\\-%?!&.=:;+,#~]'
@@ -260,7 +259,7 @@
     file = os.path.join(os.path.dirname(__file__),
                         'assets',
                         'tlds-alpha-by-domain.txt')
-    with open(file) as fobj:
+    with open(file, encoding=getdefaultencoding()) as fobj:
         return [elem for elem in fobj.read().lower().splitlines()[1:]
                 if "--" not in elem]
 
@@ -316,7 +315,7 @@
         else:
             email = match.group("email")
             if email and "mailto" not in email:
-                mailto = "mailto:{}".format(email)
+                mailto = f"mailto:{email}"
             else:
                 mailto = match.group(1)
             rval.append(Chunk(None, mailto))
@@ -412,7 +411,7 @@
     # lines with more than one entry or one entry that's
     # a URL are the only lines containing URLs.
 
-    linechunks = [parse_text_urls(l, regex=regex) for l in lines]
+    linechunks = [parse_text_urls(i, regex=regex) for i in lines]
 
     return extract_with_context(linechunks,
                                 lambda chunk: len(chunk) > 1 or
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/urlscan-0.9.7/urlscan.1 new/urlscan-0.9.9/urlscan.1
--- old/urlscan-0.9.7/urlscan.1 2021-10-06 20:10:59.000000000 +0200
+++ new/urlscan-0.9.9/urlscan.1 2022-01-29 05:38:35.000000000 +0100
@@ -66,6 +66,8 @@
 \fB11.\fR Scan certain email headers for URLs. Currently \fBLink\fR,
 \fBArchived-At\fR and \fBList-*\fR are scanned when \fB--headers\fR is passed.
 
+\fB12.\fR Queue multiple URLs for opening and open them all at once with \fBa\fR and \fBo\fR.
+
 .SH OPTIONS
 .TP
 .B \-c, \-\-compact
@@ -162,6 +164,8 @@
 
 The follow actions are supported:
 .TP
+\fBadd_url\fR \-\- add a URL to the queue (Default: \fBa\fR)
+.TP
 \fBall_escape\fR \-\- toggle unescape all URLs (Default: \fBu\fR)
 .TP
 \fBall_shorten\fR \-\- toggle shorten all URLs (Default: \fBS\fR)
@@ -176,12 +180,18 @@
 .TP
 \fBcontext\fR \-\- show/hide context (Default: \fBc\fR)
 .TP
+\fBdel_url\fR \-\- delete URL from the queue (Default: \fBd\fR)
+.TP
 \fBdown\fR \-\- cursor down (Default: \fBj\fR)
 .TP
 \fBhelp_menu\fR \-\- show/hide help menu (Default: \fBF1\fR)
 .TP
 \fBlink_handler\fR \-\- cycle link handling (webbrowser, xdg-open or custom) (Default: \fBl\fR)
 .TP
+\fBopen_queue\fR \-\- open all URLs in queue (Default: \fBo\fR)
+.TP
+\fBopen_queue_win\fR \-\- open all URLs in queue in new window (Default: \fBO\fR)
+.TP
 \fBopen_url\fR \-\- open selected URL (Default: \fBspace\fR or \fBenter\fR)
 .TP
 \fBpalette\fR \-\- cycle through palettes (Default: \fBp\fR)
