Aron Helser / ParaView / Commits

Commit aa6a47fe, authored Aug 30, 2016 by Ben Boeckel

Merge branch 'upstream-pygments' into update-pygments

* upstream-pygments: pygments 2016-08-30 (4e05ce0e)

Parents: 7edf6cb3, 36cbc559
Pipeline #28425 failed. 68 changed files.
ThirdParty/pygments/vtkpygments/AUTHORS (new file, mode 100644)
Pygments is written and maintained by Georg Brandl <georg@python.org>.
Major developers are Tim Hatch <tim@timhatch.com> and Armin Ronacher
<armin.ronacher@active-4.com>.
Other contributors, listed alphabetically, are:
* Sam Aaron -- Ioke lexer
* Ali Afshar -- image formatter
* Thomas Aglassinger -- Easytrieve, JCL, Rexx and Transact-SQL lexers
* Muthiah Annamalai -- Ezhil lexer
* Kumar Appaiah -- Debian control lexer
* Andreas Amann -- AppleScript lexer
* Timothy Armstrong -- Dart lexer fixes
* Jeffrey Arnold -- R/S, Rd, BUGS, Jags, and Stan lexers
* Jeremy Ashkenas -- CoffeeScript lexer
* José Joaquín Atria -- Praat lexer
* Stefan Matthias Aust -- Smalltalk lexer
* Lucas Bajolet -- Nit lexer
* Ben Bangert -- Mako lexers
* Max Battcher -- Darcs patch lexer
* Thomas Baruchel -- APL lexer
* Tim Baumann -- (Literate) Agda lexer
* Paul Baumgart, 280 North, Inc. -- Objective-J lexer
* Michael Bayer -- Myghty lexers
* Thomas Beale -- Archetype lexers
* John Benediktsson -- Factor lexer
* Trevor Bergeron -- mIRC formatter
* Vincent Bernat -- LessCSS lexer
* Christopher Bertels -- Fancy lexer
* Sébastien Bigaret -- QVT Operational lexer
* Jarrett Billingsley -- MiniD lexer
* Adam Blinkinsop -- Haskell, Redcode lexers
* Frits van Bommel -- assembler lexers
* Pierre Bourdon -- bugfixes
* Matthias Bussonnier -- ANSI style handling for terminal-256 formatter
* chebee7i -- Python traceback lexer improvements
* Hiram Chirino -- Scaml and Jade lexers
* Mauricio Caceres -- SAS and Stata lexers.
* Ian Cooper -- VGL lexer
* David Corbett -- Inform, Jasmin, JSGF, Snowball, and TADS 3 lexers
* Leaf Corcoran -- MoonScript lexer
* Christopher Creutzig -- MuPAD lexer
* Daniël W. Crompton -- Pike lexer
* Pete Curry -- bugfixes
* Bryan Davis -- EBNF lexer
* Bruno Deferrari -- Shen lexer
* Giedrius Dubinskas -- HTML formatter improvements
* Owen Durni -- Haxe lexer
* Alexander Dutton, Oxford University Computing Services -- SPARQL lexer
* James Edwards -- Terraform lexer
* Nick Efford -- Python 3 lexer
* Sven Efftinge -- Xtend lexer
* Artem Egorkine -- terminal256 formatter
* Matthew Fernandez -- CAmkES lexer
* Michael Ficarra -- CPSA lexer
* James H. Fisher -- PostScript lexer
* William S. Fulton -- SWIG lexer
* Carlos Galdino -- Elixir and Elixir Console lexers
* Michael Galloy -- IDL lexer
* Naveen Garg -- Autohotkey lexer
* Laurent Gautier -- R/S lexer
* Alex Gaynor -- PyPy log lexer
* Richard Gerkin -- Igor Pro lexer
* Alain Gilbert -- TypeScript lexer
* Alex Gilding -- BlitzBasic lexer
* Bertrand Goetzmann -- Groovy lexer
* Krzysiek Goj -- Scala lexer
* Andrey Golovizin -- BibTeX lexers
* Matt Good -- Genshi, Cheetah lexers
* Michał Górny -- vim modeline support
* Alex Gosse -- TrafficScript lexer
* Patrick Gotthardt -- PHP namespaces support
* Olivier Guibe -- Asymptote lexer
* Jordi Gutiérrez Hermoso -- Octave lexer
* Florian Hahn -- Boogie lexer
* Martin Harriman -- SNOBOL lexer
* Matthew Harrison -- SVG formatter
* Steven Hazel -- Tcl lexer
* Dan Michael Heggø -- Turtle lexer
* Aslak Hellesøy -- Gherkin lexer
* Greg Hendershott -- Racket lexer
* Justin Hendrick -- ParaSail lexer
* David Hess, Fish Software, Inc. -- Objective-J lexer
* Varun Hiremath -- Debian control lexer
* Rob Hoelz -- Perl 6 lexer
* Doug Hogan -- Mscgen lexer
* Ben Hollis -- Mason lexer
* Max Horn -- GAP lexer
* Alastair Houghton -- Lexer inheritance facility
* Tim Howard -- BlitzMax lexer
* Dustin Howett -- Logos lexer
* Ivan Inozemtsev -- Fantom lexer
* Hiroaki Itoh -- Shell console rewrite, Lexers for PowerShell session,
MSDOS session, BC, WDiff
* Brian R. Jackson -- Tea lexer
* Christian Jann -- ShellSession lexer
* Dennis Kaarsemaker -- sources.list lexer
* Dmitri Kabak -- Inferno Limbo lexer
* Igor Kalnitsky -- vhdl lexer
* Alexander Kit -- MaskJS lexer
* Pekka Klärck -- Robot Framework lexer
* Gerwin Klein -- Isabelle lexer
* Eric Knibbe -- Lasso lexer
* Stepan Koltsov -- Clay lexer
* Adam Koprowski -- Opa lexer
* Benjamin Kowarsch -- Modula-2 lexer
* Domen Kožar -- Nix lexer
* Oleh Krekel -- Emacs Lisp lexer
* Alexander Kriegisch -- Kconfig and AspectJ lexers
* Marek Kubica -- Scheme lexer
* Jochen Kupperschmidt -- Markdown processor
* Gerd Kurzbach -- Modelica lexer
* Jon Larimer, Google Inc. -- Smali lexer
* Olov Lassus -- Dart lexer
* Matt Layman -- TAP lexer
* Kristian Lyngstøl -- Varnish lexers
* Sylvestre Ledru -- Scilab lexer
* Chee Sing Lee -- Flatline lexer
* Mark Lee -- Vala lexer
* Valentin Lorentz -- C++ lexer improvements
* Ben Mabey -- Gherkin lexer
* Angus MacArthur -- QML lexer
* Louis Mandel -- X10 lexer
* Louis Marchand -- Eiffel lexer
* Simone Margaritelli -- Hybris lexer
* Kirk McDonald -- D lexer
* Gordon McGregor -- SystemVerilog lexer
* Stephen McKamey -- Duel/JBST lexer
* Brian McKenna -- F# lexer
* Charles McLaughlin -- Puppet lexer
* Lukas Meuser -- BBCode formatter, Lua lexer
* Cat Miller -- Pig lexer
* Paul Miller -- LiveScript lexer
* Hong Minhee -- HTTP lexer
* Michael Mior -- Awk lexer
* Bruce Mitchener -- Dylan lexer rewrite
* Reuben Morais -- SourcePawn lexer
* Jon Morton -- Rust lexer
* Paulo Moura -- Logtalk lexer
* Mher Movsisyan -- DTD lexer
* Dejan Muhamedagic -- Crmsh lexer
* Ana Nelson -- Ragel, ANTLR, R console lexers
* Kurt Neufeld -- Markdown lexer
* Nam T. Nguyen -- Monokai style
* Jesper Noehr -- HTML formatter "anchorlinenos"
* Mike Nolta -- Julia lexer
* Jonas Obrist -- BBCode lexer
* Edward O'Callaghan -- Cryptol lexer
* David Oliva -- Rebol lexer
* Pat Pannuto -- nesC lexer
* Jon Parise -- Protocol buffers and Thrift lexers
* Benjamin Peterson -- Test suite refactoring
* Ronny Pfannschmidt -- BBCode lexer
* Dominik Picheta -- Nimrod lexer
* Andrew Pinkham -- RTF Formatter Refactoring
* Clément Prévost -- UrbiScript lexer
* Oleh Prypin -- Crystal lexer (based on Ruby lexer)
* Elias Rabel -- Fortran fixed form lexer
* raichoo -- Idris lexer
* Kashif Rasul -- CUDA lexer
* Justin Reidy -- MXML lexer
* Norman Richards -- JSON lexer
* Corey Richardson -- Rust lexer updates
* Lubomir Rintel -- GoodData MAQL and CL lexers
* Andre Roberge -- Tango style
* Georg Rollinger -- HSAIL lexer
* Michiel Roos -- TypoScript lexer
* Konrad Rudolph -- LaTeX formatter enhancements
* Mario Ruggier -- Evoque lexers
* Miikka Salminen -- Lovelace style, Hexdump lexer, lexer enhancements
* Stou Sandalski -- NumPy, FORTRAN, tcsh and XSLT lexers
* Matteo Sasso -- Common Lisp lexer
* Joe Schafer -- Ada lexer
* Ken Schutte -- Matlab lexers
* Sebastian Schweizer -- Whiley lexer
* Tassilo Schweyer -- Io, MOOCode lexers
* Ted Shaw -- AutoIt lexer
* Joerg Sieker -- ABAP lexer
* Robert Simmons -- Standard ML lexer
* Kirill Simonov -- YAML lexer
* Corbin Simpson -- Monte lexer
* Alexander Smishlajev -- Visual FoxPro lexer
* Steve Spigarelli -- XQuery lexer
* Jerome St-Louis -- eC lexer
* Camil Staps -- Clean and NuSMV lexers
* James Strachan -- Kotlin lexer
* Tom Stuart -- Treetop lexer
* Colin Sullivan -- SuperCollider lexer
* Ben Swift -- Extempore lexer
* Edoardo Tenani -- Arduino lexer
* Tiberius Teng -- default style overhaul
* Jeremy Thurgood -- Erlang, Squid config lexers
* Brian Tiffin -- OpenCOBOL lexer
* Bob Tolbert -- Hy lexer
* Matthias Trute -- Forth lexer
* Erick Tryzelaar -- Felix lexer
* Alexander Udalov -- Kotlin lexer improvements
* Thomas Van Doren -- Chapel lexer
* Daniele Varrazzo -- PostgreSQL lexers
* Abe Voelker -- OpenEdge ABL lexer
* Pepijn de Vos -- HTML formatter CTags support
* Matthias Vallentin -- Bro lexer
* Benoît Vinot -- AMPL lexer
* Linh Vu Hong -- RSL lexer
* Nathan Weizenbaum -- Haml and Sass lexers
* Nathan Whetsell -- Csound lexers
* Dietmar Winkler -- Modelica lexer
* Nils Winter -- Smalltalk lexer
* Davy Wybiral -- Clojure lexer
* Whitney Young -- ObjectiveC lexer
* Diego Zamboni -- CFengine3 lexer
* Enrique Zamudio -- Ceylon lexer
* Alex Zimin -- Nemerle lexer
* Rob Zimmerman -- Kal lexer
* Vincent Zurczak -- Roboconf lexer
Many thanks for all contributions!
ThirdParty/pygments/vtkpygments/CHANGES (new file, mode 100644)

(diff collapsed; contents not shown)
ThirdParty/pygments/vtkpygments/LICENSE (new file, mode 100644)
Copyright (c) 2006-2015 by the respective authors (see AUTHORS file).
All rights reserved.
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are
met:
* Redistributions of source code must retain the above copyright
notice, this list of conditions and the following disclaimer.
* Redistributions in binary form must reproduce the above copyright
notice, this list of conditions and the following disclaimer in the
documentation and/or other materials provided with the distribution.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
ThirdParty/pygments/vtkpygments/pygmentize (new file, mode 100755)

#!/usr/bin/env python2

import sys
import pygments.cmdline
try:
    sys.exit(pygments.cmdline.main(sys.argv))
except KeyboardInterrupt:
    sys.exit(1)
ThirdParty/pygments/vtkpygments/pygments/__init__.py (new file, mode 100644)

# -*- coding: utf-8 -*-
"""
    Pygments
    ~~~~~~~~

    Pygments is a syntax highlighting package written in Python.

    It is a generic syntax highlighter for general use in all kinds of software
    such as forum systems, wikis or other applications that need to prettify
    source code. Highlights are:

    * a wide range of common languages and markup formats is supported
    * special attention is paid to details, increasing quality by a fair amount
    * support for new languages and formats are added easily
    * a number of output formats, presently HTML, LaTeX, RTF, SVG, all image
      formats that PIL supports, and ANSI sequences
    * it is usable as a command-line tool and as a library
    * ... and it highlights even Brainfuck!

    The `Pygments tip`_ is installable with ``easy_install Pygments==dev``.

    .. _Pygments tip:
       http://bitbucket.org/birkenfeld/pygments-main/get/tip.zip#egg=Pygments-dev

    :copyright: Copyright 2006-2015 by the Pygments team, see AUTHORS.
    :license: BSD, see LICENSE for details.
"""

import sys

from pygments.util import StringIO, BytesIO

__version__ = '2.2a0'
__docformat__ = 'restructuredtext'

__all__ = ['lex', 'format', 'highlight']


def lex(code, lexer):
    """
    Lex ``code`` with ``lexer`` and return an iterable of tokens.
    """
    try:
        return lexer.get_tokens(code)
    except TypeError as err:
        if (isinstance(err.args[0], str) and
            ('unbound method get_tokens' in err.args[0] or
             'missing 1 required positional argument' in err.args[0])):
            raise TypeError('lex() argument must be a lexer instance, '
                            'not a class')
        raise


def format(tokens, formatter, outfile=None):  # pylint: disable=redefined-builtin
    """
    Format a tokenlist ``tokens`` with the formatter ``formatter``.

    If ``outfile`` is given and a valid file object (an object
    with a ``write`` method), the result will be written to it, otherwise
    it is returned as a string.
    """
    try:
        if not outfile:
            realoutfile = getattr(formatter, 'encoding', None) and BytesIO() or StringIO()
            formatter.format(tokens, realoutfile)
            return realoutfile.getvalue()
        else:
            formatter.format(tokens, outfile)
    except TypeError as err:
        if (isinstance(err.args[0], str) and
            ('unbound method format' in err.args[0] or
             'missing 1 required positional argument' in err.args[0])):
            raise TypeError('format() argument must be a formatter instance, '
                            'not a class')
        raise


def highlight(code, lexer, formatter, outfile=None):
    """
    Lex ``code`` with ``lexer`` and format it with the formatter ``formatter``.

    If ``outfile`` is given and a valid file object (an object
    with a ``write`` method), the result will be written to it, otherwise
    it is returned as a string.
    """
    return format(lex(code, lexer), formatter, outfile)


if __name__ == '__main__':  # pragma: no cover
    from pygments.cmdline import main
    sys.exit(main(sys.argv))
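The format() function above selects an in-memory buffer based on whether the formatter declares an encoding: a byte buffer when it does, a text buffer when it does not. A minimal, self-contained sketch of that selection logic, using a hypothetical DummyFormatter stand-in rather than a real Pygments class:

```python
from io import BytesIO, StringIO

class DummyFormatter:
    # Hypothetical stand-in for a Pygments formatter; only the
    # 'encoding' attribute matters for the buffer choice.
    encoding = None

def pick_buffer(formatter):
    # Mirrors the expression in format(): a declared encoding means the
    # formatter emits bytes, so buffer with BytesIO; otherwise the
    # output is text, so buffer with StringIO.
    return getattr(formatter, 'encoding', None) and BytesIO() or StringIO()

f = DummyFormatter()
print(type(pick_buffer(f)).__name__)   # StringIO (no encoding -> text)
f.encoding = 'utf-8'
print(type(pick_buffer(f)).__name__)   # BytesIO (encoding set -> bytes)
```

The `x and a or b` idiom predates Python's conditional expression; code written today would use `BytesIO() if ... else StringIO()`, which also avoids the classic pitfall of a falsy `a`.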
ThirdParty/pygments/vtkpygments/pygments/cmdline.py (new file, mode 100644)

# -*- coding: utf-8 -*-
"""
    pygments.cmdline
    ~~~~~~~~~~~~~~~~

    Command line interface.

    :copyright: Copyright 2006-2015 by the Pygments team, see AUTHORS.
    :license: BSD, see LICENSE for details.
"""

from __future__ import print_function

import sys
import getopt
from textwrap import dedent

from pygments import __version__, highlight
from pygments.util import ClassNotFound, OptionError, docstring_headline, \
    guess_decode, guess_decode_from_terminal, terminal_encoding
from pygments.lexers import get_all_lexers, get_lexer_by_name, guess_lexer, \
    get_lexer_for_filename, find_lexer_class_for_filename
from pygments.lexers.special import TextLexer
from pygments.formatters.latex import LatexEmbeddedLexer, LatexFormatter
from pygments.formatters import get_all_formatters, get_formatter_by_name, \
    get_formatter_for_filename, find_formatter_class
from pygments.formatters.terminal import TerminalFormatter
from pygments.filters import get_all_filters, find_filter_class
from pygments.styles import get_all_styles, get_style_by_name


USAGE = """\
Usage: %s [-l <lexer> | -g] [-F <filter>[:<options>]] [-f <formatter>]
          [-O <options>] [-P <option=value>] [-s] [-v] [-o <outfile>] [<infile>]

       %s -S <style> -f <formatter> [-a <arg>] [-O <options>] [-P <option=value>]
       %s -L [<which> ...]
       %s -N <filename>
       %s -H <type> <name>
       %s -h | -V

Highlight the input file and write the result to <outfile>.

If no input file is given, use stdin, if -o is not given, use stdout.

If -s is passed, lexing will be done in "streaming" mode, reading and
highlighting one line at a time.  This will only work properly with
lexers that have no constructs spanning multiple lines!

<lexer> is a lexer name (query all lexer names with -L). If -l is not
given, the lexer is guessed from the extension of the input file name
(this obviously doesn't work if the input is stdin).  If -g is passed,
attempt to guess the lexer from the file contents, or pass through as
plain text if this fails (this can work for stdin).

Likewise, <formatter> is a formatter name, and will be guessed from
the extension of the output file name. If no output file is given,
the terminal formatter will be used by default.

With the -O option, you can give the lexer and formatter a comma-
separated list of options, e.g. ``-O bg=light,python=cool``.

The -P option adds lexer and formatter options like the -O option, but
you can only give one option per -P. That way, the option value may
contain commas and equals signs, which it can't with -O, e.g.
``-P "heading=Pygments, the Python highlighter"``.

With the -F option, you can add filters to the token stream, you can
give options in the same way as for -O after a colon (note: there must
not be spaces around the colon).

The -O, -P and -F options can be given multiple times.

With the -S option, print out style definitions for style <style>
for formatter <formatter>. The argument given by -a is formatter
dependent.

The -L option lists lexers, formatters, styles or filters -- set
`which` to the thing you want to list (e.g. "styles"), or omit it to
list everything.

The -N option guesses and prints out a lexer name based solely on
the given filename. It does not take input or highlight anything.
If no specific lexer can be determined "text" is returned.

The -H option prints detailed help for the object <name> of type <type>,
where <type> is one of "lexer", "formatter" or "filter".

The -s option processes lines one at a time until EOF, rather than
waiting to process the entire file.  This only works for stdin, and
is intended for streaming input such as you get from 'tail -f'.
Example usage: "tail -f sql.log | pygmentize -s -l sql"

The -v option prints a detailed traceback on unhandled exceptions,
which is useful for debugging and bug reports.

The -h option prints this help.
The -V option prints the package version.
"""
def _parse_options(o_strs):
    opts = {}
    if not o_strs:
        return opts
    for o_str in o_strs:
        if not o_str.strip():
            continue
        o_args = o_str.split(',')
        for o_arg in o_args:
            o_arg = o_arg.strip()
            try:
                o_key, o_val = o_arg.split('=', 1)
                o_key = o_key.strip()
                o_val = o_val.strip()
            except ValueError:
                opts[o_arg] = True
            else:
                opts[o_key] = o_val
    return opts
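_parse_options turns each comma-separated -O string into a dict: "key=value" pairs become string options, and bare words become boolean flags. A standalone re-implementation (renamed parse_options so it is not mistaken for the vendored function):

```python
def parse_options(o_strs):
    # Comma-separated "key=value" pairs become string options;
    # bare words (e.g. a flag like "linenos") become True.
    opts = {}
    for o_str in o_strs or []:
        for o_arg in o_str.split(','):
            o_arg = o_arg.strip()
            if not o_arg:
                continue
            try:
                o_key, o_val = o_arg.split('=', 1)
            except ValueError:
                opts[o_arg] = True          # no '=': boolean flag
            else:
                opts[o_key.strip()] = o_val.strip()
    return opts

print(parse_options(['bg=light,python=cool', 'linenos']))
# {'bg': 'light', 'python': 'cool', 'linenos': True}
```

Note that values are kept as strings ('True' stays 'True'); only the presence of a bare word produces an actual boolean.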
def _parse_filters(f_strs):
    filters = []
    if not f_strs:
        return filters
    for f_str in f_strs:
        if ':' in f_str:
            fname, fopts = f_str.split(':', 1)
            filters.append((fname, _parse_options([fopts])))
        else:
            filters.append((f_str, {}))
    return filters
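_parse_filters splits each -F argument at the first colon into a filter name and an option string, reusing the -O option syntax for the options. A self-contained sketch, with a simplified local split_options helper standing in for _parse_options:

```python
def split_options(opt_str):
    # Simplified stand-in for _parse_options: "k=v,k2=v2" -> dict,
    # bare words -> True.
    opts = {}
    for part in opt_str.split(','):
        part = part.strip()
        if not part:
            continue
        if '=' in part:
            k, v = part.split('=', 1)
            opts[k.strip()] = v.strip()
        else:
            opts[part] = True
    return opts

def parse_filters(f_strs):
    # "name:opts" -> (name, {parsed opts}); bare "name" -> (name, {}).
    filters = []
    for f_str in f_strs or []:
        if ':' in f_str:
            fname, fopts = f_str.split(':', 1)
            filters.append((fname, split_options(fopts)))
        else:
            filters.append((f_str, {}))
    return filters

print(parse_filters(['whitespace:spaces=True', 'keywordcase']))
# [('whitespace', {'spaces': 'True'}), ('keywordcase', {})]
```

Splitting on the first colon only means option values themselves may contain colons, which matters for options such as regular expressions.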
def _print_help(what, name):
    try:
        if what == 'lexer':
            cls = get_lexer_by_name(name)
            print("Help on the %s lexer:" % cls.name)
            print(dedent(cls.__doc__))
        elif what == 'formatter':
            cls = find_formatter_class(name)
            print("Help on the %s formatter:" % cls.name)
            print(dedent(cls.__doc__))
        elif what == 'filter':
            cls = find_filter_class(name)
            print("Help on the %s filter:" % name)
            print(dedent(cls.__doc__))
        return 0
    except (AttributeError, ValueError):
        print("%s not found!" % what, file=sys.stderr)
        return 1
def _print_list(what):
    if what == 'lexer':
        print()
        print("Lexers:")
        print("~~~~~~~")

        info = []
        for fullname, names, exts, _ in get_all_lexers():
            tup = (', '.join(names) + ':', fullname,
                   exts and '(filenames ' + ', '.join(exts) + ')' or '')
            info.append(tup)
        info.sort()
        for i in info:
            print(('* %s\n    %s %s') % i)

    elif what == 'formatter':
        print()
        print("Formatters:")
        print("~~~~~~~~~~~")

        info = []
        for cls in get_all_formatters():
            doc = docstring_headline(cls)
            tup = (', '.join(cls.aliases) + ':', doc, cls.filenames and
                   '(filenames ' + ', '.join(cls.filenames) + ')' or '')
            info.append(tup)
        info.sort()
        for i in info:
            print(('* %s\n    %s %s') % i)

    elif what == 'filter':
        print()
        print("Filters:")
        print("~~~~~~~~")

        for name in get_all_filters():
            cls = find_filter_class(name)
            print("* " + name + ':')
            print("    %s" % docstring_headline(cls))

    elif what == 'style':
        print()
        print("Styles:")
        print("~~~~~~~")

        for name in get_all_styles():
            cls = get_style_by_name(name)
            print("* " + name + ':')
            print("    %s" % docstring_headline(cls))
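_print_list relies on docstring_headline from pygments.util to produce the one-line summaries; that helper is not shown in this diff. A plausible sketch, assuming it returns the docstring's first paragraph collapsed to a single line:

```python
def docstring_headline(obj):
    # Assumed behaviour: join the lines of the docstring's first
    # paragraph (up to the first blank line) into one line.
    if not obj.__doc__:
        return ''
    res = []
    for line in obj.__doc__.strip().splitlines():
        if not line.strip():
            break
        res.append(' ' + line.strip())
    return ''.join(res).lstrip()

class DemoFilter:
    """Highlight trailing whitespace
    in token streams.

    Longer description that the listing omits.
    """

print(docstring_headline(DemoFilter))
# Highlight trailing whitespace in token streams.
```

DemoFilter is a made-up class for illustration; any object with a multi-paragraph docstring behaves the same way.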
def main_inner(popts, args, usage):
    opts = {}
    O_opts = []
    P_opts = []
    F_opts = []
    for opt, arg in popts:
        if opt == '-O':
            O_opts.append(arg)
        elif opt == '-P':
            P_opts.append(arg)
        elif opt == '-F':
            F_opts.append(arg)
        opts[opt] = arg

    if opts.pop('-h', None) is not None:
        print(usage)
        return 0

    if opts.pop('-V', None) is not None:
        print('Pygments version %s, (c) 2006-2015 by Georg Brandl.' % __version__)
        return 0

    # handle ``pygmentize -L``
    L_opt = opts.pop('-L', None)
    if L_opt is not None:
        if opts:
            print(usage, file=sys.stderr)
            return 2

        # print version
        main(['', '-V'])
        if not args:
            args = ['lexer', 'formatter', 'filter', 'style']
        for arg in args:
            _print_list(arg.rstrip('s'))
        return 0

    # handle ``pygmentize -H``
    H_opt = opts.pop('-H', None)
    if H_opt is not None:
        if opts or len(args) != 2:
            print(usage, file=sys.stderr)
            return 2

        what, name = args  # pylint: disable=unbalanced-tuple-unpacking
        if what not in ('lexer', 'formatter', 'filter'):
            print(usage, file=sys.stderr)
            return 2

        return _print_help(what, name)

    # parse -O options
    parsed_opts = _parse_options(O_opts)
    opts.pop('-O', None)