update prezto

2018-11-18 23:37:19 +04:00
parent a17ecf9f57
commit 2808949f26
352 changed files with 15169 additions and 25328 deletions


@@ -33,10 +33,11 @@ version is 4.3.11.
done
```
Note: If you already have any of the given configuration files, `ln` will
cause an error. In simple cases you can load prezto by adding the line
`source "${ZDOTDIR:-$HOME}/.zprezto/init.zsh"` to the bottom of your
`.zshrc` and keep the rest of your Zsh configuration intact. For more
complicated setups, it is recommended that you back up your original
configs and replace them with the provided prezto runcoms.
4. Set Zsh as your default shell:
@@ -89,6 +90,10 @@ accompanying README files to learn of what is available.
window or tab.
![sorin theme][2]
Note that the 'git' module may be required for special symbols to appear,
such as those on the right of the above image. Add `'git'` to the `pmodule`
list (under `zstyle ':prezto:load' pmodule \` in your *~/.zpreztorc*) to
enable this module.
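For reference, a minimal *~/.zpreztorc* fragment with `'git'` added might look like this (the surrounding module names here are illustrative, not prescriptive):

```shell
# Load these prezto modules in order; 'git' enables the extra prompt symbols.
zstyle ':prezto:load' pmodule \
  'environment' \
  'git' \
  'prompt'
```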
### External Modules
@@ -99,7 +104,7 @@ accompanying README files to learn of what is available.
Note that module names need to be unique or they will cause an error when
loading.
```sh
zstyle ':prezto:load' pmodule-dirs $HOME/.zprezto-contrib
```


@@ -31,9 +31,9 @@ installed:
- *.7z* requires `7za`.
- *.deb* requires `ar`, `tar`.
Additionally, if `pigz` and/or `pbzip2` are installed, `archive` will use them
over their traditional counterparts, `gzip` and `bzip2` respectively, to take
full advantage of all available CPU cores for compression.
Alternatives
------------


@@ -1,8 +1,3 @@
Autosuggestions
===============


@@ -5,9 +5,10 @@ When you try to use a command that is not available locally, searches
the package manager for a package offering that command and suggests
the proper install command.
Debian and Arch Linux based distributions use the [`command-not-found`][1] tool.
macOS uses Homebrew's [`command-not-found` clone][2]. Note that you also need to
[follow the instructions][3] to tap the `command-not-found` homebrew repository.
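Tapping uses Homebrew's standard `brew tap` subcommand (a sketch; see the linked instructions for the authoritative steps):

```shell
brew tap homebrew/command-not-found
```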
Authors


@@ -6,6 +6,19 @@ the [zsh-completions][1] project.
This module must be loaded **after** the *utility* module.
Options
-------
- `COMPLETE_IN_WORD` complete from both ends of a word.
- `ALWAYS_TO_END` move cursor to the end of a completed word.
- `PATH_DIRS` perform path search even on command names with slashes.
- `AUTO_MENU` show completion menu on a successive tab press.
- `AUTO_LIST` automatically list choices on ambiguous completion.
- `AUTO_PARAM_SLASH` if completed parameter is a directory, add a trailing slash.
- `EXTENDED_GLOB` needed for file modification glob modifiers with compinit.
- `MENU_COMPLETE` do not autoselect the first completion entry.
- `FLOW_CONTROL` disable start/stop characters in shell editor.
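The list above corresponds roughly to the following `setopt`/`unsetopt` calls (a zsh sketch of the effect, not prezto's actual source):

```shell
# Options whose described behavior comes from enabling them:
setopt COMPLETE_IN_WORD ALWAYS_TO_END PATH_DIRS AUTO_MENU AUTO_LIST \
       AUTO_PARAM_SLASH EXTENDED_GLOB
# Options whose described behavior comes from *disabling* them:
unsetopt MENU_COMPLETE FLOW_CONTROL
```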
Settings
--------


@@ -0,0 +1,39 @@
# Contributing
## How to Contribute to zsh-completions
Contributions are welcome, just make sure you follow the guidelines:
* Completions are not accepted when already available in zsh.
* Completions are not accepted when already available in their original project.
* Please do not just copy/paste someone else's completion, ask before.
* Partially implemented completions are not accepted.
* Please add a header containing authors, status and origin of the script and license header if you do not wish to use the Zsh license (example [here](src/_ack)).
* Any reasonable open source license is acceptable but note that we recommend the use of the Zsh license and that you should use it if you hope for the function to migrate to zsh itself.
* Please try to follow the [Zsh completion style guide](https://github.com/zsh-users/zsh/blob/master/Etc/completion-style-guide).
* Please send one separate pull request per file.
* Send a pull request or ask for committer access.
## Contributing Completion Functions to Zsh
The zsh project itself welcomes completion function contributions via
[GitHub pull requests](https://github.com/zsh-users/zsh/),
[GitLab merge requests](https://gitlab.com/zsh-org/zsh/) or via patch
files sent to its mailing list, `zsh-workers@zsh.org`.
Contributing to zsh has the advantage of reaching the most users.
## Including Completion Functions in Upstream Projects
Many upstream projects include zsh completions.
If well maintained, this has the advantage that users get a completion
function that matches the installed version of their software.
If you are the upstream maintainer this is a good choice. If the project
already includes completions for bash, fish, tcsh, etc., then they are
likely open to including zsh's too. It can also be a good option for
completions handling commands that are system or distribution specific.
Ideally, arrange for the project's build system to install the
completion function in `$prefix/share/zsh/site-functions`.
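As a sketch, installing a hypothetical completion file `_mycmd` into that directory might look like:

```shell
# Stage directory stands in for the real $prefix (e.g. /usr/local).
PREFIX="${PREFIX:-$PWD/stage}"
# Stand-in completion function, for illustration only.
printf '#compdef mycmd\n_files "$@"\n' > _mycmd
# zsh picks up functions from $prefix/share/zsh/site-functions via $fpath.
mkdir -p "$PREFIX/share/zsh/site-functions"
install -m 0644 _mycmd "$PREFIX/share/zsh/site-functions/"
```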


@@ -6,11 +6,6 @@ zsh-completions ![GitHub release](https://img.shields.io/github/release/zsh-user
*This project aims at gathering/developing new completion scripts that are not available in Zsh yet. The scripts may be contributed to the Zsh project when stable enough.*
## Usage
### Using packages
@@ -20,11 +15,14 @@ See [issues](https://github.com/zsh-users/zsh-completions/issues) for details on
| Debian / Ubuntu | [zsh-completions OBS repository](https://software.opensuse.org/download.html?project=shells%3Azsh-users%3Azsh-completions&package=zsh-completions) |
| Fedora / CentOS / RHEL / Scientific Linux | [zsh-completions OBS repository](https://software.opensuse.org/download.html?project=shells%3Azsh-users%3Azsh-completions&package=zsh-completions) |
| OpenSUSE / SLE | [zsh-completions OBS repository](https://software.opensuse.org/download.html?project=shells%3Azsh-users%3Azsh-completions&package=zsh-completions) |
| Arch Linux / Manjaro / Antergos / Hyperbola | [zsh-completions](https://www.archlinux.org/packages/zsh-completions), [zsh-completions-git](https://aur.archlinux.org/packages/zsh-completions-git) |
| Gentoo / Funtoo | [app-shells/zsh-completions](http://packages.gentoo.org/package/app-shells/zsh-completions) |
| NixOS | [zsh-completions](https://github.com/NixOS/nixpkgs/blob/master/pkgs/shells/zsh/zsh-completions/default.nix) |
| Void Linux | [zsh-completions](https://github.com/void-linux/void-packages/blob/master/srcpkgs/zsh-completions/template) |
| Slackware | [Slackbuilds](https://slackbuilds.org/repository/14.2/system/zsh-completions) |
| Mac OS | [homebrew](https://github.com/Homebrew/homebrew-core/blob/master/Formula/zsh-completions.rb), [MacPorts](https://github.com/macports/macports-ports/blob/master/sysutils/zsh-completions/Portfile) |
| NetBSD | [pkgsrc](ftp://ftp.netbsd.org/pub/pkgsrc/current/pkgsrc/shells/zsh-completions/README.html) |
### Using zsh frameworks
@@ -59,18 +57,9 @@ Add `antigen bundle zsh-users/zsh-completions` to your `~/.zshrc`.
### Contributing
Contributions are welcome, see [CONTRIBUTING](https://github.com/zsh-users/zsh-completions/blob/master/CONTRIBUTING.md).
## License
Completions use the Zsh license, unless explicitly mentioned in the file header.
See [LICENSE](https://github.com/zsh-users/zsh-completions/blob/master/LICENSE) for more information.


@@ -38,30 +38,24 @@
#
# ------------------------------------------------------------------------------
_arguments \
  '(- 1 *)-h[display usage information]' \
  "(-c --classify -l --learn= -t --tag -u --update -U --update-reference -m --move-mails)"{-w,--watch}"[continuously monitor the mailbox for new messages matching the given query]" \
  "(-c --classify -l --learn= -u --update -U --update-reference -m --move-mails -w --watch)"{-t,--tag}"[run the tag filters]" \
  "(-c --classify -t --tag -u --update -U --update-reference -m --move-mails -w --watch)"{-l,--learn=}"[train category with the messages matching query]" \
  "(-c --classify -l --learn= -t --tag -U --update-reference -m --move-mails -w --watch)"{-u,--update}"[update the categories (requires no query)]" \
  "(-c --classify -l --learn= -t --tag -u --update -m --move-mails -w --watch)"{-U,--update-reference}"[update the reference category (takes quite some time) (requires no query)]" \
  "(-l --learn= -t --tag -u --update -U --update-reference -m --move-mails -w --watch)"{-c,--classify}"[classify each message matching the given query]" \
  "(-c --classify -l --learn= -t --tag -u --update -U --update-reference -w --watch)"{-m,--move-mails}"[move mail files between maildir folders]" \
  "(-n --new)"{-a,--all}"[operate on all email]" \
  "(-a --all)"{-n,--new}"[operate on all new email]" \
  {-C,--notmuch-config=}"[specify path to notmuch configuration file]:files:_files" \
  {-e,--enable-filters=}"[specify filter classes to use]:filter" \
  {-d,--dry-run}"[don't change the DB]" \
  {-R,--reference-set-size=}"[specify size of the reference set]:size [1000]" \
  {-T,--reference-set-timeframe-days=}"[don't use emails older than specified age]:age (days) [30]" \
  {--verbose,-v}"[be more verbose]" \
  '*: :_guard "^-*" query'
# Local Variables:
# mode: Shell-Script


@@ -40,7 +40,7 @@
# ------------------------------------------------------------------------------
#
# Needs either ANSIBLE_HOSTS or /etc/ansible/hosts on linux
# (or /usr/local/etc/ansible/hosts on macOS)
#
# Note 1: the following gist (https://gist.github.com/15ed54a438a36d67fd99.git)
# has some files to help improve the hostfile shell parsing
@@ -416,8 +416,8 @@ _modules=(
  'os_user:Manage OpenStack Identity Users'
  'os_user_group:Associate OpenStack Identity users and groups'
  'os_volume:Create/Delete Cinder Volumes'
  'osx_defaults:osx_defaults allows users to read, write, and delete macOS user defaults from Ansible'
  'osx_say:Makes a macOS computer speak.'
  'ovirt:oVirt/RHEV platform management'
  'package:Generic OS package manager'
  'pacman:Manage packages with *pacman*'
@@ -643,7 +643,7 @@ _ansible ()
local curcontext="$curcontext" state line
typeset -A opt_args
_arguments -s -C -W \
  '1:pattern:->pattern'\
  "(-a --args)"{-a,--args}"[ARGS module arguments]:arguments:(ARG)"\
  '--ask-become-pass[ask for privilege escalation password]'\
@@ -658,14 +658,14 @@ _ansible ()
"(-C --check)"{-C,--check}"[don't make any changes]"\ "(-C --check)"{-C,--check}"[don't make any changes]"\
"(-c --connection)"{-c,--connection}"[CONNECTION connection type to use (default=smart)]:connection type:(smart ssh local chroot)"\ "(-c --connection)"{-c,--connection}"[CONNECTION connection type to use (default=smart)]:connection type:(smart ssh local chroot)"\
"(-D --diff)"{-D,--diff}"[show differences when changing (small) files and templates]"\ "(-D --diff)"{-D,--diff}"[show differences when changing (small) files and templates]"\
"(-e --extra-vars)"{-e,--extra-vars}"[set additional variables as key=value or YAML/JSON]"\ "*"{-e,--extra-vars}"[set additional variables as key=value or YAML/JSON]"\
"(-f --forks)"{-f,--forks}"[FORKS number of parallel processes to use (default=5)]:forks:(5)"\ "(-f --forks)"{-f,--forks}"[FORKS number of parallel processes to use (default=5)]:forks:(5)"\
"(-h --help)"{-h,--help}"[help message]"\ "(-h --help)"{-h,--help}"[help message]"\
"(-i --inventory-file)"{-i,--inventory-file}"[INVENTORY specify inventory host file]:inventory file:_files"\ "*"{-i,--inventory,--inventory-file}"[INVENTORY specify inventory host file]:inventory file:_files"\
"(-l --limit)"{-l,--limit}"[SUBSET further limit selected hosts to an additional pattern]:subset pattern:->pattern"\ "(-l --limit)"{-l,--limit}"[SUBSET further limit selected hosts to an additional pattern]:subset pattern:->pattern"\
'--list-hosts[outputs a list of matching hosts. Does not execute anything else]'\ '--list-hosts[outputs a list of matching hosts. Does not execute anything else]'\
"(-m --module-name)"{-m,--module-name}"[MODULE_NAME module name (default=command)]:module name:->module"\ "(-m --module-name)"{-m,--module-name}"[MODULE_NAME module name (default=command)]:module name:->module"\
"(-M --module-path)"{-M,--module-path}"[MODULE_PATH specify path to module library (default=None)]:module path:_files -/"\ "*"{-M,--module-path}"[MODULE_PATH specify path to module library (default=None)]:module path:_files -/"\
'--new-vault-password-file[new vault password file for rekey]:new vault password file:_files'\ '--new-vault-password-file[new vault password file for rekey]:new vault password file:_files'\
"(-o --one-line)"{-o,--one-line}"[condense output]"\ "(-o --one-line)"{-o,--one-line}"[condense output]"\
'--output[output file name for encrypt or decrypt; use - for stdout]:output file:_files'\ '--output[output file name for encrypt or decrypt; use - for stdout]:output file:_files'\
@@ -683,8 +683,8 @@ _ansible ()
"(-T --timeout)"{-T,--timeout}"[TIMEOUT override the SSH timeout (s) (default=10)]:ssh timeout:(10)"\ "(-T --timeout)"{-T,--timeout}"[TIMEOUT override the SSH timeout (s) (default=10)]:ssh timeout:(10)"\
"(-t --tree)"{-t,--tree}"[OUTPUT_DIRECTORY log output to this directory]:output directory:_files -/"\ "(-t --tree)"{-t,--tree}"[OUTPUT_DIRECTORY log output to this directory]:output directory:_files -/"\
"(-u --user)"{-u,--user}"[REMOTE_USER connect as this user (default=${USER})]:connect as user:(${USER})"\ "(-u --user)"{-u,--user}"[REMOTE_USER connect as this user (default=${USER})]:connect as user:(${USER})"\
"--vault-password-file[VAULT_PASSWORD_FILE vault password file]:vault password file:_files"\ "*--vault-password-file[VAULT_PASSWORD_FILE vault password file]:vault password file:_files"\
"(-v --verbose)"{-v,--verbose}"[verbose mode (-vvv for more, -vvvv to enable connection debugging)]"\ "*"{-v,--verbose}"[verbose mode (-vvv for more, -vvvv to enable connection debugging)]"\
"--version[show program's version number and exit]"\ "--version[show program's version number and exit]"\
case $state in case $state in


@@ -84,7 +84,7 @@ _ansible-galaxy ()
_arguments \
  "(-h --help)"{-h,--help}"[help message]" \
  "(-c --ignore-certs)"{-c,--ignore-certs}"[Ignore SSL certificate validation errors.]" \
  "*"{-p,--roles-path}"[ROLES_PATH The path to the directory containing your roles (default: from ansible.cfg)]:roles path:_files -/" \
  "(-s --server)"{-s,--server}"[API_SERVER The API server destination]:api server:(http://apiserver)" \
  "(-v --verbose)"{-v,--verbose}"[verbose mode (-vvv for more, -vvvv to enable connection debugging)]" \
  "--version[show program's version number and exit]" \


@@ -40,7 +40,7 @@
# ------------------------------------------------------------------------------
#
# Needs either ANSIBLE_HOSTS or /etc/ansible/hosts on linux
# (or /usr/local/etc/ansible/hosts on macOS)
#
# Note 1: the following gist (https://gist.github.com/15ed54a438a36d67fd99.git)
# has some files to help improve the hostfile shell parsing
@@ -126,8 +126,8 @@ _ansible-playbook ()
local curcontext="$curcontext" state line
typeset -A opt_args
_arguments -s -C -W \
  "*:playbook yml file:_files -g '*.yml|*.yaml'"\
  '--ask-become-pass[ask for privilege escalation password]'\
  "(-k --ask-pass)"{-k,--ask-pass}"[ask for connection password]"\
  '--ask-su-pass[ask for su password (deprecated, use become)]'\
@@ -139,23 +139,23 @@ _ansible-playbook ()
"(-C --check)"{-C,--check}"[don't make any changes]"\ "(-C --check)"{-C,--check}"[don't make any changes]"\
"(-c --connection)"{-c,--connection}"[CONNECTION connection type to use (default=smart)]:connection type:(smart ssh local chroot)"\ "(-c --connection)"{-c,--connection}"[CONNECTION connection type to use (default=smart)]:connection type:(smart ssh local chroot)"\
"(-D --diff)"{-D,--diff}"[when changing (small files and templates, show the diff in those. Works great with --check)]"\ "(-D --diff)"{-D,--diff}"[when changing (small files and templates, show the diff in those. Works great with --check)]"\
"(-e --extra-vars)"{-e,--extra-vars}"[EXTRA_VARS set additional variables as key=value or YAML/JSON]:extra vars:(EXTRA_VARS)"\ "*"{-e,--extra-vars}"[EXTRA_VARS set additional variables as key=value or YAML/JSON]:extra vars:(EXTRA_VARS)"\
'--flush-cache[clear the fact cache]'\ '--flush-cache[clear the fact cache]'\
'--force-handlers[run handlers even if a task fails]'\ '--force-handlers[run handlers even if a task fails]'\
"(-f --forks)"{-f,--forks}"[FORKS number of parallel processes to use (default=5)]:forks:(5)"\ "(-f --forks)"{-f,--forks}"[FORKS number of parallel processes to use (default=5)]:forks:(5)"\
"(-h --help)"{-h,--help}"[help message]"\ "(-h --help)"{-h,--help}"[help message]"\
"(-i --inventory-file)"{-i,--inventory-file}"[INVENTORY specify inventory host file]:inventory file:_files"\ "*"{-i,--inventory,--inventory-file}"[INVENTORY specify inventory host file]:inventory file:_files"\
"(-l --limit)"{-l,--limit}"[SUBSET further limit selected hosts to an additional pattern]:subset pattern:->pattern"\ "(-l --limit)"{-l,--limit}"[SUBSET further limit selected hosts to an additional pattern]:subset pattern:->pattern"\
'--list-hosts[outputs a list of matching hosts. Does not execute anything else]'\ '--list-hosts[outputs a list of matching hosts. Does not execute anything else]'\
'--list-tags[list all available tags]'\ '--list-tags[list all available tags]'\
'--list-tasks[list all tasks that would be executed]'\ '--list-tasks[list all tasks that would be executed]'\
"(-M --module-path)"{-M,--module-path}"[MODULE_PATH specify path to module library (default=None)]:module path:_files -/"\ "*"{-M,--module-path}"[MODULE_PATH specify path to module library (default=None)]:module path:_files -/"\
'--new-vault-password-file[new vault password file for rekey]:new vault password file:_files'\ '--new-vault-password-file[new vault password file for rekey]:new vault password file:_files'\
'--output[output file name for encrypt or decrypt; use - for stdout]:output file:_files'\ '--output[output file name for encrypt or decrypt; use - for stdout]:output file:_files'\
'--private-key[PRIVATE_KEY_FILE use this file to authenticate the connection]:private key file:_files'\ '--private-key[PRIVATE_KEY_FILE use this file to authenticate the connection]:private key file:_files'\
'--scp-extra-args[specify extra arguments to pass to scp only]'\ '--scp-extra-args[specify extra arguments to pass to scp only]'\
'--sftp-extra-args[specify extra arguments to pass to sftp only]'\ '--sftp-extra-args[specify extra arguments to pass to sftp only]'\
"--skip-tags[SKIP_TAGS only run plays and tasks whose tags do not match these values]:skip tags:(SKIP_TAGS)"\ "*--skip-tags[SKIP_TAGS only run plays and tasks whose tags do not match these values]:skip tags:(SKIP_TAGS)"\
'--ssh-common-args[specify common arguments to pass to sftp/scp/ssh]'\ '--ssh-common-args[specify common arguments to pass to sftp/scp/ssh]'\
'--ssh-extra-args[specify extra arguments to pass to ssh only]'\ '--ssh-extra-args[specify extra arguments to pass to ssh only]'\
"--start-at-task[START_AT start the playbook at the task matching this name]:name:(TASK_NAME)"\ "--start-at-task[START_AT start the playbook at the task matching this name]:name:(TASK_NAME)"\
@@ -165,11 +165,11 @@ _ansible-playbook ()
"(-s --sudo)"{-s,--sudo}"[run operations with sudo (nopasswd) (deprecated, use become)]"\ "(-s --sudo)"{-s,--sudo}"[run operations with sudo (nopasswd) (deprecated, use become)]"\
"(-U --sudo-user)"{-U,--sudo-user}"[SUDO_USER desired sudo user (default=root) (deprecated, use become)]:su user:(root)"\ "(-U --sudo-user)"{-U,--sudo-user}"[SUDO_USER desired sudo user (default=root) (deprecated, use become)]:su user:(root)"\
'--syntax-check[perform a syntax check on the playbook, but do not execute it]'\ '--syntax-check[perform a syntax check on the playbook, but do not execute it]'\
"(-t --tags)"{-t,--tags}"[TAGS only run plays and tasks gagged with these values]:task tags:(TAGS)"\ "*"{-t,--tags}"[TAGS only run plays and tasks gagged with these values]:task tags:(TAGS)"\
"(-T --timeout)"{-T,--timeout}"[TIMEOUT override the SSH timeout (s) (default=10)]:ssh timeout:(10)"\ "(-T --timeout)"{-T,--timeout}"[TIMEOUT override the SSH timeout (s) (default=10)]:ssh timeout:(10)"\
"(-u --user)"{-u,--user}"[REMOTE_USER connect as this user (default=${USER})]:connect as user:(${USER})"\ "(-u --user)"{-u,--user}"[REMOTE_USER connect as this user (default=${USER})]:connect as user:(${USER})"\
"--vault-password-file[VAULT_PASSWORD_FILE vault password file]:vault password file:_files"\ "*--vault-password-file[VAULT_PASSWORD_FILE vault password file]:vault password file:_files"\
"(-v --verbose)"{-v,--verbose}"[verbose mode (-vvv for more, -vvvv to enable connection debugging)]"\ "*"{-v,--verbose}"[verbose mode (-vvv for more, -vvvv to enable connection debugging)]"\
"--version[show program's version number and exit]"\ "--version[show program's version number and exit]"\
case $state in case $state in


@@ -39,9 +39,9 @@
# ------------------------------------------------------------------------------
local curcontext="$curcontext" state line _opts ret=1
_arguments -C \
  '(- 1 *)'{-v,--version}'[display version information]' \
  '1: :->cmds' \
  '*:: :->args' && ret=0
@@ -49,35 +49,34 @@ _arguments -C -A "-v" -A "--version" \
case $state in
cmds)
  _values "bower command" \
    "cache[manage bower cache]" \
    "help[display help information about Bower]" \
    "home[opens a package homepage into your favorite browser]" \
    "info[info of a particular package]" \
    "init[interactively create a bower.json file]" \
    "install[install a package locally]" \
    "link[symlink a package folder]" \
    "list[list local packages - and possible updates]" \
    "login[authenticate with GitHub and store credentials]" \
    "lookup[look up a package URL by name]" \
    "prune[removes local extraneous packages]" \
    "register[register a package]" \
    "search[search for a package by name]" \
    "update[update a local package]" \
    "uninstall[remove a local package]" \
    "unregister[remove a package from the registry]" \
    "version[bump a package version]" && ret=0
  _arguments \
    '(--force)--force[make various commands more forceful]' \
    '(--json)--json[output consumable JSON]' \
    '(--log-level)--log-level[what level of logs to report]' \
    "(--offline)--offline[don't hit the network]" \
    '(--quiet)--quiet[only output important information]' \
    "(--silent)--silent[don't output anything, besides errors]" \
    '(--verbose)--verbose[make output more verbose]' \
    '(--allow-root)--allow-root[allow running commands as root]' \
    '(--version)--version[output Bower version]' \
    '(--no-color)--no-color[disable colors]' && ret=0
  ;;
args)
  case $line[1] in
@@ -98,54 +97,46 @@ case $state in
'update' \ 'update' \
'uninstall' \ 'uninstall' \
'unregister' \ 'unregister' \
'version' 'version' && ret=0
ret=0
;; ;;
(home|info|init|link|lookup|prune|register|search|unregister) (home|info|init|link|lookup|prune|register|search|unregister)
_arguments \ _arguments \
'(--help)--help[Show help message]' '(--help)--help[show help message]' && ret=0
ret=0
;; ;;
install) install)
_arguments \ _arguments \
'(--force-latest)--force-latest[Force latest version on conflict]' \ '(--force-latest)--force-latest[force latest version on conflict]' \
'(--help)--help[Show help message]' \ '(--help)--help[show help message]' \
'(--production)--production[Do not install project devDependencies]' \ "(--production)--production[don't install project devDependencies]" \
'(--save)--save[Save installed packages into the project''s bower.json dependencies]' \ "(--save)--save[save installed packages into the project's bower.json dependencies]" \
'(--save-dev)--save-dev[Save installed packages into the project''s bower.json devDependencies]' "(--save-dev)--save-dev[save installed packages into the project's bower.json devDependencies]" && ret=0
ret=0
;; ;;
list) list)
_arguments \ _arguments \
'(--help)--help[Show help message]' \ '(--help)--help[show help message]' \
'(--paths)--paths[Generate a simple JSON source mapping]' \ '(--paths)--paths[generate a simple JSON source mapping]' \
'(--relative)--relative[Make paths relative to the directory config property, which defaults to bower_components]' '(--relative)--relative[make paths relative to the directory config property, which defaults to bower_components]' && ret=0
ret=0
;; ;;
login) login)
_arguments \ _arguments \
'(--help)--help[Show help message]' \ '(--help)--help[show help message]' \
'(-t --token)'{-t,--token}'[Pass GitHub auth token (will not prompt for username/password)]' '(-t --token)'{-t,--token}'[pass GitHub auth token (will not prompt for username/password)]' && ret=0
ret=0
;; ;;
uninstall) uninstall)
_arguments \ _arguments \
'(--help)--help[Show help message]' \ '(--help)--help[show help message]' \
'(--save)--save[Save installed packages into th projects''s bower.json dependencies]' \ "(--save)--save[save installed packages into the project's bower.json dependencies]" \
'(--save-dev)--save-dev[Save installed packages into th projects''s bower.json devDependencies]' "(--save-dev)--save-dev[save installed packages into the project's bower.json devDependencies]" && ret=0
ret=0
;; ;;
update) update)
_arguments \ _arguments \
'(--force-latest)--force-latest[Force latest version on conflict]' \ '(--force-latest)--force-latest[force latest version on conflict]' \
'(--help)--help[Show help message]' \ '(--help)--help[show help message]' \
'(--production)--production[Do not install project devDependencies]' "(--production)--production[don't install project devDependencies]" && ret=0
ret=0
;; ;;
version) version)
_arguments \ _arguments \
'(--message)--message[Custom git commit and tag message]' '(--message)--message[custom git commit and tag message]' && ret=0
ret=0
;; ;;
exec) exec)
_normal && ret=0 _normal && ret=0
@@ -154,7 +145,7 @@ case $state in
_opts=( $(bower help $line[1] | sed -e '/^ \[-/!d; s/^ \[\(-[^=]*\)=.*/\1/') ) _opts=( $(bower help $line[1] | sed -e '/^ \[-/!d; s/^ \[\(-[^=]*\)=.*/\1/') )
_opts+=( $(bower help $line[1] | sed -e '/^ -/!d; s/^ \(-.\), \[\(-[^=]*\)=.*/\1 \2/') ) _opts+=( $(bower help $line[1] | sed -e '/^ -/!d; s/^ \(-.\), \[\(-[^=]*\)=.*/\1 \2/') )
if [[ $_opts != "" ]]; then if [[ $_opts != "" ]]; then
_values 'options' $_opts && ret=0 _values 'option' $_opts && ret=0
fi fi
;; ;;
esac esac

View File

@@ -28,7 +28,7 @@
# Description # Description
# ----------- # -----------
# #
# Completion script for the OSX 'caffeinate' tool (man 8 caffeinate). # Completion script for the macOS 'caffeinate' tool (man 8 caffeinate).
# #
# ------------------------------------------------------------------------- # -------------------------------------------------------------------------
# Authors # Authors

View File

@@ -0,0 +1,325 @@
#compdef ccache -P -value-,CCACHE_*,-default-
# zsh completion script for ccache
# Copyright 2018 CERN for the benefit of the LHCb Collaboration.
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
# * Redistributions of source code must retain the above copyright notice, this
# list of conditions and the following disclaimer.
#
# * Redistributions in binary form must reproduce the above copyright notice,
# this list of conditions and the following disclaimer in the documentation
# and/or other materials provided with the distribution.
#
# * Neither the name of the copyright holder nor the names of its
# contributors may be used to endorse or promote products derived from
# this software without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
# DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
# FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
# DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
# SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
# CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
# OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
#
# In applying this licence, CERN does not waive the privileges and immunities
# granted to it by virtue of its status as an Intergovernmental Organization
# or submit itself to any jurisdiction.
# allow users to define their better compilers
# inspired by _cmake_compilers
# users could override with
#
# _ccache_compilers() {
# local -a _ccache_compilers
# _ccache_compilers=(gcc g++ clang clang++)
# _wanted compilers expl "compiler" compadd -- $_ccache_compilers
# }
(( $+functions[_ccache_compilers] )) ||
_ccache_compilers() {
_command_names -e
}
_ccache_booleans() {
_message 'There are no "false" values; unset the variable to disable'
local description; description=${1:-boolean}
local booleans; booleans=(
'true'
'yes'
)
_describe -t booleans "$description" booleans
}
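# Note: ccache booleans are "set vs. unset" -- even CCACHE_DISABLE=false
# counts as set and disables the cache, which is why only truthy values
# are offered above.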
(( $+functions[_ccache_compressionlevels] )) ||
_ccache_compressionlevels() {
local -a one_nine
one_nine=(1 2 3 4 5 6 7 8 9)
_describe -t onetonine "compression level (if using compression)" one_nine
}
(( $+functions[_ccache_sloppiness] )) ||
_ccache_sloppiness() {
_values -s ',' \
"file_macro[ignore __FILE__]" \
"file_stat_matches[rely on mtimes and ctimes]" \
"include_file_ctime[ignore headers' ctime too new]" \
"include_file_mtime[ignore headers' mtime too new]" \
"no_system_headers[exclude system headers from cache]" \
"pch_defines[be sloppy about #defines in pch]" \
"time_macros[ignore __date__ and __time__]"
}
(( $+functions[_ccache_compilerchecks] )) ||
_ccache_compilerchecks() {
local -a compiler_check_values
compiler_check_values=(
'content: the actual compiler binary'
'mtime: mtime and size of the compiler'
'none: ignore compiler for hashing'
'string\:: any hard coded string (pre-computed version)'
'%compiler%\ -v:any compiler invocation output'
)
_describe -t compilerchecks "compiler information included in the hash" compiler_check_values
}
(( $+functions[_ccache_dirlevels] )) ||
_ccache_dirlevels() {
local -a one_eight
one_eight=(1 2 3 4 5 6 7 8)
_describe -t onetoeight "directory levels in the cache directory" one_eight
}
if [[ "$service" = -value-* ]]; then
case $service in
*CCACHE_*DIR*)
# CCACHE_BASEDIR: relative to which top level paths are hashed
# CCACHE_DIR: where the cache and config are kept
# CCACHE_TEMPDIR: where temporary files are kept
# all: a single path
_path_files -/
;;
*CCACHE_NLEVELS*)
_ccache_dirlevels
;;
*CCACHE_CC*)
_ccache_compilers
;;
*CCACHE_COMPILERCHECK*)
_ccache_compilerchecks
;;
*CCACHE_*COMPRESS*)
_ccache_booleans "write compressed cache"
;;
*CCACHE_COMPRESSLEVEL*)
_ccache_compressionlevels
;;
*CCACHE_EXTENSION*)
_alternative ':set extension for intermediate files: '
;;
*CCACHE_*DIRECT*)
_ccache_booleans "use direct mode"
;;
*CCACHE_*DISABLE*)
_ccache_booleans "disable cache usage"
;;
*CCACHE_EXTRAFILES*)
local sep=:
compset -P "*${sep}"
compset -S "${sep}*" || suf="$sep"
_files "" -r "${sep}"' /\t\t\-' "$@"
;;
*CCACHE_*HARDLINK*)
_ccache_booleans "create hard links rather than copies"
;;
*CCACHE_*HASHDIR*)
_ccache_booleans "include the cwd in the hash"
;;
*CCACHE_IGNOREHEADERS*)
_dir_list
;;
*CCACHE_*COMMENTS*)
_ccache_booleans "consider comments in hashing"
;;
*CCACHE_LIMIT_MULTIPLE*)
_alternative ":clean up down to level (e.g. 0.8): "
;;
*CCACHE_LOGFILE*)
_path_files -g "*(/) *.log"
;;
*CCACHE_MAXFILES*)
_alternative ":maximum number of files in the cache (0= no limit): "
;;
*CCACHE_MAXSIZE*)
_alternative ':maximum cache size (0= no limit) with suffix k,M,G,T or Ki,Mi,Gi,Ti: '
;;
*CCACHE_PATH*)
_alternative ':PATH for compiler lookup (instead of $PATH):_dir_list'
;;
*CCACHE_PREFIX*)
_alternative ':prefixes for compiler invocation: '
;;
*CCACHE_PREFIX_CPP*)
_alternative ':prefixes for preprocessor invocation: '
;;
*CCACHE_*READONLY*)
_ccache_booleans "treat cache as read-only"
;;
*CCACHE_*READONLY_DIRECT*)
_ccache_booleans "retrieve from read-only cache in direct mode"
;;
*CCACHE_*RECACHE*)
_ccache_booleans "use cache in write-only mode"
;;
*CCACHE_*CPP2*)
_ccache_booleans "pass originial rather than preprocessed source code to compiler"
;;
*CCACHE_SLOPPINESS*)
_ccache_sloppiness
;;
*CCACHE_*STATS*)
_ccache_booleans "update statistics counters"
;;
*CCACHE_UMASK*)
_alternative ":umask value (octal): "
;;
*CCACHE_*UNIFY*)
_ccache_booleans "normalise sources prior to processing"
;;
esac
return
fi
__ccache_config_keys() {
local -a keys
keys=(
'compression:write compressed cache'
'direct_mode:use direct mode'
'disable:disable cache usage'
'hard_link:create hard links rather than copies'
'hash_dir:include the cwd in the hash'
'keep_comments_cpp:consider comments in hashing'
'read_only:treat cache as read-only'
'read_only_direct:retrieve from read-only cache in direct mode'
'recache:use cache in write-only mode'
'run_second_cpp:pass original rather than preprocessed source code to compiler'
'stats:update statistics counters'
'unify:normalise sources prior to processing'
'base_dir:specify relative to which top level paths are hashed'
'temporary_dir:specify where temporary files are kept'
'cache_dir:specify where the cache is kept'
'compiler:specify compiler'
'cache_dir_levels:directory levels in the cache directory'
'compiler_check:compiler information included in the hash'
'compression_level:cache compression level'
'cpp_extension:set extensions for intermediate files'
'extra_files_to_hash:additional files to consider in hashing'
'ignore_headers_in_manifest:set paths to headers to ignore in hashing'
'limit_multiple:cleanup level'
'log_file:specify a log file'
'max_files:maximum number of files in the cache'
'max_size:maximum size of the cache'
'path:PATH for compiler lookup (instead of $PATH)'
'prefix_command:prefixes for compiler invocation'
'prefix_command_cpp:prefixes for preprocessor invocation'
'sloppiness:hash files sloppily'
'umask:set umask for ccache and child processes (e.g. for sharing cache)'
)
_describe -t configkeys "configuration keys" keys -S '='
}
if compset -P '--set-config=*='; then
case $IPREFIX in
*=compression= | *=direct_mode= | *=disable= | *=hard_link= | *=hash_dir= | *=keep_comments_cpp= | *=read_only= | *=read_only_direct= | *=recache= | *=run_second_cpp= | *=stats= | *=unify= )
local booleans; booleans=(
'true'
'false'
)
_describe -t booleans 'boolean' booleans
;;
*=base_dir= | *=temporary_dir= | *=cache_dir=)
_path_files -/
;;
*=compiler=)
_ccache_compilers
;;
*=cache_dir_levels=)
_ccache_dirlevels
;;
*=compiler_check=)
_ccache_compilerchecks
;;
*=compression_level=)
_ccache_compressionlevels
;;
*=cpp_extension=)
_alternative ':set extension for intermediate files: '
;;
*=extra_files_to_hash=)
local sep=:
compset -P "*${sep}"
compset -S "${sep}*" || suf="$sep"
_files "" -r "${sep}"' /\t\t\-' "$@"
;;
*=ignore_headers_in_manifest=)
_dir_list
;;
*=limit_multiple=)
_alternative ":clean up down to level (e.g. 0.8): "
;;
*=log_file=)
_path_files -g "*(/) *.log"
;;
*=max_files=)
_alternative ":maximum number of files in the cache (0= no limit): "
;;
*=max_size=)
_alternative ':maximum cache size (0= no limit) with suffix k,M,G,T or Ki,Mi,Gi,Ti: '
;;
*=path=)
_alternative ':PATH for compiler lookup (instead of $PATH):_dir_list'
;;
*=prefix_command=)
_alternative ':prefixes for compiler invokation: '
;;
*=prefix_command_cpp=)
_alternative ':prefixes for preprocessor invokation: '
;;
*=sloppiness=)
_ccache_sloppiness
;;
*=umask=)
_alternative ":umask value (octal): "
;;
esac
elif [[ $words[2] == -* ]]; then
# if the first argument starts with -, we are in configure-ccache mode
_arguments \
'*'{-o,--set-config=}"[set configuration key]:keys:__ccache_config_keys" \
'(: -)'{-h,--help}'[show help message]' \
'(: -)'{-V,--version}'[print version and copyright information]' \
'(-z --zero-stats)'{-z,--zero-stats}'[zero statistics counters]' \
'(-c --cleanup)'{-c,--cleanup}'[delete old files and recalculate size counters]' \
'(-C --clear)'{-C,--clear}'[clear the cache completely (except configuration)]' \
'(-p --print-config)'{-p,--print-config}'[print current configuration options]' \
'(-s --show-stats)'{-s,--show-stats}'[show statistics summary]' \
'(-F --max-files=)'{-F,--max-files=}'[set maximum number of files in cache]:number of files in cache: ' \
'(-M --max-size=)'{-M,--max-size=}'[set maximum size of cache]:cache size: '
elif [[ $CURRENT -eq 2 ]]; then
_ccache_compilers
else
# the command line already looks like 'ccache <compiler> ...'
# forward to the completion function of the compiler
(( CURRENT-- ))
shift words
_normal
fi
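# Illustrative usage sketch (not part of the script): with this function
# installed, completion is attempted in the three situations handled above --
# values of CCACHE_* variables, ccache's own options, and compiler names:
#   CCACHE_MAXSIZE=<TAB>          offers size values for the variable
#   ccache --set-config=<TAB>     offers configuration keys
#   ccache <TAB>                  offers compiler names, then forwards the
#                                 rest of the line to the compiler's completion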

View File

@@ -0,0 +1,211 @@
#compdef chromium
# Copyright 2018 CERN for the benefit of the LHCb Collaboration
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are
# met:
#
# * Redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer.
# * Redistributions in binary form must reproduce the above
# copyright notice, this list of conditions and the following disclaimer
# in the documentation and/or other materials provided with the
# distribution.
# * Neither the name of CERN nor the names of its
# contributors may be used to endorse or promote products derived from
# this software without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
#
# In applying this licence, CERN does not waive the privileges and immunities
# granted to it by virtue of its status as an Intergovernmental Organization
# or submit itself to any jurisdiction.
_arguments \
"--user-data-dir=[Specify the directory that user data is kept in]:directory:_path_files -/" \
"--app=[Runs URL in app mode]:url:_urls" \
"--incognito[Open in incognito mode]" \
"--new-window[open in new window]" \
"(--no-proxy-server --proxy-auto-detect --proxy-pac-url --password-store)--proxy-server=[specify proxy server]:[<proxy-scheme>\://]<proxy-host>[\:<proxy-port>]:_chromium_proxyurls" \
"--no-proxy-server[Disables the proxy server]" \
"--proxy-auto-detect[Autodetect proxy configuration]" \
"--proxy-pac-url=[Specify proxy autoconfiguration URL]:proxy autoconfiguration url:_urls" \
"--password-store=[Set the password store to use]:password store: _wanted arguments expl 'wallet store' compadd -- basic gnome kwallet" \
"--version[print version]" \
"*:: :{ _alternative _urls _files }"
# excerpt from the chromium help message:
#
# Specify the HTTP/SOCKS4/SOCKS5 proxy server to use for requests. This overrides any environment variables or settings picked via the options dialog. An individual proxy server is specified
# using the format:
#
#    [<proxy-scheme>://]<proxy-host>[:<proxy-port>]
#
# Where <proxy-scheme> is the protocol of the proxy server, and is one of:
#
# "http", "socks", "socks4", "socks5".
#
# If the <proxy-scheme> is omitted, it defaults to "http". Also note that "socks" is equivalent to "socks5".
#
# Examples:
#
# --proxy-server="foopy:99"
# Use the HTTP proxy "foopy:99" to load all URLs.
#
# --proxy-server="socks://foobar:1080"
# Use the SOCKS v5 proxy "foobar:1080" to load all URLs.
#
# --proxy-server="socks4://foobar:1080"
# Use the SOCKS v4 proxy "foobar:1080" to load all URLs.
#
# --proxy-server="socks5://foobar:66"
# Use the SOCKS v5 proxy "foobar:66" to load all URLs.
#
# It is also possible to specify a separate proxy server for different URL types, by prefixing the proxy server specifier with a URL specifier:
#
# Example:
#
# --proxy-server="https=proxy1:80;http=socks4://baz:1080"
# Load https://* URLs using the HTTP proxy "proxy1:80". And load http://*
# URLs using the SOCKS v4 proxy "baz:1080".
#
_chromium_proxyurls () {
#TODO: semicolon separated urls not yet implemented
# mostly copied from _urls
local ipre scheme host user uhosts ret=1 expl match glob suf
local localhttp
zstyle -a ":completion:${curcontext}:urls" local localhttp
local localhttp_servername="$localhttp[1]"
local localhttp_documentroot="$localhttp[2]"
local localhttp_userdir="$localhttp[3]"
zstyle -a ":completion:${curcontext}:urls" urls urls
if [[ $#urls -gt 1 || ( $#urls -eq 1 && ! -d $urls[1] ) ]]
then
[[ $#urls -eq 1 && -f $urls[1] ]] && urls=($(< $urls[1]))
_wanted urls expl 'URL' compadd "$@" -a urls && return 0
urls=()
fi
urls="$urls[1]"
glob=(-g '*(^/)')
zparseopts -D -K -E 'g:=glob'
ipre="$IPREFIX"
if ! compset -P '(#b)([-+.a-z0-9]#):'
then
_tags -C argument prefixes
while _tags
do
while _next_label prefixes expl 'URL prefix' -S '' "$@"
do
compset -S '[^:/]*' && compstate[to_end]=''
compadd "$expl[@]" http:// socks:// socks4:// socks5:// && ret=0
done
(( ret )) || return 0
done
return 1
fi
scheme="$match[1]"
case "$scheme" in
(http(|s)|socks(|4|5)) if ! compset -P //
then
_wanted -C "$scheme" prefixes expl 'end of prefix' compadd -S '' "$@" //
return
fi ;;
(file) [[ -prefix //(127.0.0.1|localhost)/ ]] && compset -P '//(127.0.0.1|localhost)'
[[ -prefix /// ]] && compset -P //
if ! compset -P //
then
_tags -C file files
while _tags
do
while _next_label files expl 'local file'
do
if [[ -prefix / ]]
then
_path_files "$expl[@]" -S '' "${glob[@]}" && ret=0
_path_files "$expl[@]" -S/ -r '/' -/ && ret=0
elif [[ -z "$PREFIX" ]]
then
compadd -S '/' -r '/' "$expl[@]" "$@" - "${PWD%/}" && ret=0
fi
done
(( ret )) || return 0
done
return 1
fi ;;
esac
if ! compset -P '(#b)([^:/]#)([:/])'
then
uhosts=($urls/$scheme/$PREFIX*$SUFFIX(/:t))
_tags hosts
while _tags
do
while _next_label hosts expl host
do
compset -S '[:/]*' || suf="/"
(( $#uhosts )) || _hosts -S "$suf" -r '/:' "$expl[@]" && ret=0
[[ "$scheme" = http ]] && uhosts=($uhosts $localhttp_servername)
compadd -S "$suf" -r '/:' "$expl[@]" -a uhosts && ret=0
done
(( ret )) || return 0
done
return 1
fi
host="$match[1]"
[[ $match[2] = ':' ]] && ! compset -P '<->/' && _message -e ports 'port number' && return 0
_tags remote-files files || return 1
if [[ "$localhttp_servername" = "$host" ]]
then
if compset -P \~
then
if ! compset -P '(#b)([^/]#)/'
then
_users -S/ "$@"
return
fi
user="$match[1]"
while _tags
do
while _next_label files expl 'local file'
do
_path_files "$expl[@]" "$@" -W ~$user/$localhttp_userdir "${glob[@]}" && ret=0
_path_files -S/ -r '/' "$expl[@]" -W ~$user/$localhttp_userdir -/ && ret=0
done
(( ret )) || return 0
done
else
while _tags
do
while _next_label files expl 'local file'
do
_path_files "$expl[@]" "$@" -W $localhttp_documentroot "${glob[@]}" && ret=0
_path_files -S/ -r '/' "$expl[@]" -W $localhttp_documentroot -/ && ret=0
done
(( ret )) || return 0
done
fi
else
while _tags
do
(( $#urls )) && while _next_label files expl 'local file'
do
_path_files "$expl[@]" "$@" -W $urls/$scheme/$host "${glob[@]}" && ret=0
_path_files -S/ -r '/' "$expl[@]" -W $urls/$scheme/$host -/ && ret=0
done
[[ $scheme = (scp|sftp) ]] && _requested remote-files && _remote_files -h $host -- ssh && ret=0
(( ret )) || return 0
done
fi
return $ret
}

View File

@@ -1,6 +1,6 @@
#compdef cmake #compdef cmake
# ------------------------------------------------------------------------------ # ------------------------------------------------------------------------------
# Copyright (c) 2016 Github zsh-users - http://github.com/zsh-users # Copyright (c) 2017 Github zsh-users - http://github.com/zsh-users
# All rights reserved. # All rights reserved.
# #
# Redistribution and use in source and binary forms, with or without # Redistribution and use in source and binary forms, with or without
@@ -35,7 +35,7 @@
# ------- # -------
# #
# * Scott M. Kroll <skroll@gmail.com> (initial version) # * Scott M. Kroll <skroll@gmail.com> (initial version)
# * Paul Seyfert <pseyfert@mathphys.fsk.uni-heidelberg.de> (handling of --build) # * Paul Seyfert <pseyfert.mathphys@gmail.com> (handling of --build and updates)
# #
# ------------------------------------------------------------------------- # -------------------------------------------------------------------------
# Notes # Notes
@@ -78,9 +78,15 @@ local cmake_build_options;cmake_build_options=(
# ------------------------ # ------------------------
# _cmake_generator_options # _cmake_generator_options
#
# arguments are $1: build working directory (top level Makefile or build.ninja file)
# $2: position of "--" in the command line
# ------------------------ # ------------------------
(( $+functions[_cmake_generator_options] )) || (( $+functions[_cmake_generator_options] )) ||
_cmake_generator_options() { _cmake_generator_options() {
# pass only the part of the command line starting at "--" to the completion
shift $(( $2 - 1 )) words
(( CURRENT = $CURRENT + 1 - $2 ))
if [ -f $1/Makefile ] if [ -f $1/Makefile ]
then then
$_comps[make] $_comps[make]
@@ -116,6 +122,10 @@ _cmake_targets() {
_describe 'build targets' targets _describe 'build targets' targets
} }
_cmake_suggest_builddirs() {
_alternative ':current directory:(.)' 'directory::_directories' && return 0
}
_cmake_on_build() { _cmake_on_build() {
local build_extras;build_extras=( local build_extras;build_extras=(
'--[Native build tool options]' '--[Native build tool options]'
@@ -130,14 +140,14 @@ _cmake_on_build() {
(( i++ )) (( i++ ))
done done
inbuild=false inbuild=false
nativemode=false dashdashposition=-1
for ((i = (($CURRENT - 1)); i > 1 ; i--)); do for ((i = (($CURRENT - 1)); i > 1 ; i--)); do
if [[ $words[$i] == --build ]] ; then if [[ $words[$i] == --build ]] ; then
inbuild=true inbuild=true
buildat=$i buildat=$i
(( difference = $CURRENT - $i )) (( difference = $CURRENT - $i ))
elif [[ $words[$i] == -- ]] ; then elif [[ $words[$i] == -- ]] ; then
nativemode=true dashdashposition=$i
fi fi
done done
# check if build mode has been left # check if build mode has been left
@@ -149,18 +159,16 @@ _cmake_on_build() {
if [[ $words[(($i - 1))] == --config ]] ; then continue ; fi if [[ $words[(($i - 1))] == --config ]] ; then continue ; fi
outofbuild=true outofbuild=true
done done
if [ "$nativemode" = true ] ; then if (( $dashdashposition > 0 )) ; then
_cmake_generator_options $words[(($buildat + 1))] && return 0 _cmake_generator_options $words[(($buildat + 1))] $dashdashposition && return 0
fi fi
if [ "$inbuild" = false ] ; then if [[ "$inbuild" == false || "$difference" -eq 1 ]] ; then
# either there is no --build or completing the directory after --build
_arguments -C -s \ _arguments -C -s \
- build_opts \ - build_opts \
"$cmake_build_options[@]" \ "$cmake_build_options[@]" \
- build_cmds \ - build_cmds \
"$cmake_suggest_build[@]" && return 0 "$cmake_suggest_build[@]" && return 0
elif [ $difference -eq 1 ] ; then
# directly after --build comes the build directory
_alternative ':current directory:(.)' 'directory::_directories' && return 0
elif [[ $words[(($CURRENT - 1))] == --target ]] ; then elif [[ $words[(($CURRENT - 1))] == --target ]] ; then
# after --build <dir> --target, suggest targets # after --build <dir> --target, suggest targets
_cmake_targets $words[(($buildat + 1))] && return 0 _cmake_targets $words[(($buildat + 1))] && return 0
@@ -291,6 +299,9 @@ _cmake_define_lang_property_names() {
"CMAKE_${cmake_lang}_FLAGS_RELEASE:${cmake_lang_desc} compiler flags for all Relase build" "CMAKE_${cmake_lang}_FLAGS_RELEASE:${cmake_lang_desc} compiler flags for all Relase build"
"CMAKE_${cmake_lang}_FLAGS_MINSIZREL:${cmake_lang_desc} compiler flags for all MinSizRel build" "CMAKE_${cmake_lang}_FLAGS_MINSIZREL:${cmake_lang_desc} compiler flags for all MinSizRel build"
"CMAKE_${cmake_lang}_FLAGS_RELWITHDEBINFO:${cmake_lang_desc} compiler flags for all RelWithDebInfo build" "CMAKE_${cmake_lang}_FLAGS_RELWITHDEBINFO:${cmake_lang_desc} compiler flags for all RelWithDebInfo build"
"CMAKE_${cmake_lang}_STANDARD:${cmake_lang_desc} language standard"
"CMAKE_${cmake_lang}_STANDARD_REQUIRED:${cmake_lang_desc} language standard is required"
"CMAKE_${cmake_lang}_EXTENSIONS:${cmake_lang_desc} enable compiler specific extensions"
) )
_describe -t "${cmake_lang//:/-}-property-names" "${cmake_lang_desc} property name" properties $@[0,-3] && return 0 _describe -t "${cmake_lang//:/-}-property-names" "${cmake_lang_desc} property name" properties $@[0,-3] && return 0
@@ -302,12 +313,15 @@ _cmake_define_lang_property_names() {
(( $+functions[_cmake_define_common_property_names] )) || (( $+functions[_cmake_define_common_property_names] )) ||
_cmake_define_common_property_names() { _cmake_define_common_property_names() {
local properties; properties=( local properties; properties=(
'CMAKE_MODULE_PATH:Search path for cmake modules' 'CMAKE_MODULE_PATH:Search path for cmake modules (FindPROJECT.cmake)'
'CMAKE_PREFIX_PATH:Search path for installations (PROJECTConfig.cmake)'
'CMAKE_BUILD_TYPE:Specifies the build type for make based generators' 'CMAKE_BUILD_TYPE:Specifies the build type for make based generators'
'CMAKE_TOOLCHAIN_FILE:Absolute or relative path to a cmake script which sets up toolchain related variables' 'CMAKE_TOOLCHAIN_FILE:Absolute or relative path to a cmake script which sets up toolchain related variables'
'CMAKE_COLOR_MAKEFILE:Enables/disables color output when using the Makefile generator' 'CMAKE_COLOR_MAKEFILE:Enables/disables color output when using the Makefile generator'
'CMAKE_INSTALL_PREFIX:Install directory used by install' 'CMAKE_INSTALL_PREFIX:Install directory used by install'
'CMAKE_EXPORT_COMPILE_COMMANDS:Enable/disable output of compilation database during generation' 'CMAKE_EXPORT_COMPILE_COMMANDS:Enable/disable output of compilation database during generation'
'CMAKE_RULE_MESSAGES:Specify whether to report a message for each make rule'
'CMAKE_VERBOSE_MAKEFILE:Enable verbose output from Makefile builds'
) )
_describe -t 'common-property-names' 'common property name' properties $@ _describe -t 'common-property-names' 'common property name' properties $@
@@ -322,12 +336,18 @@ _cmake_define_property_values() {
setopt localoptions extendedglob setopt localoptions extendedglob
case $@[-1] in case $@[-1] in
(CMAKE_BUILD_TYPE) _wanted build-types expl 'build type' _cmake_build_types && ret=0;; (CMAKE_BUILD_TYPE) _wanted build-types expl 'build type' _cmake_build_types && ret=0;;
(CMAKE_CXX_STANDARD) _wanted cxx-standards expl 'cxx standard' _cmake_cxx_standards && ret=0;;
(CMAKE_C_STANDARD) _wanted c-standards expl 'c standard' _cmake_c_standards && ret=0;;
(CMAKE_TOOLCHAIN_FILE) _wanted toolchain-files expl 'file' _cmake_toolchain_files && ret=0;; (CMAKE_TOOLCHAIN_FILE) _wanted toolchain-files expl 'file' _cmake_toolchain_files && ret=0;;
(CMAKE_COLOR_MAKEFILE) _wanted booleans expl 'boolean' _cmake_booleans && ret=0;; (CMAKE_COLOR_MAKEFILE) _wanted booleans expl 'boolean' _cmake_booleans && ret=0;;
(CMAKE_RULE_MESSAGES) _wanted booleans expl 'boolean' _cmake_booleans && ret=0;;
(CMAKE_VERBOSE_MAKEFILE) _wanted booleans expl 'boolean' _cmake_booleans && ret=0;;
(CMAKE_INSTALL_PREFIX) _files -/ && ret=0;; (CMAKE_INSTALL_PREFIX) _files -/ && ret=0;;
(CMAKE_EXPORT_COMPILE_COMMANDS) _wanted booleans expl 'boolean' _cmake_booleans && ret=0;; (CMAKE_EXPORT_COMPILE_COMMANDS) _wanted booleans expl 'boolean' _cmake_booleans && ret=0;;
(CMAKE_*_COMPILER) _wanted compilers expl 'compiler' _cmake_compilers && ret=0;; (CMAKE_*_COMPILER) _wanted compilers expl 'compiler' _cmake_compilers && ret=0;;
(CMAKE_*_FLAGS(|_?*)) _message -e compiler-flags 'compiler flags' && ret=0;; (CMAKE_*_FLAGS(|_?*)) _message -e compiler-flags 'compiler flags' && _dispatch $service -value-,CPPFLAGS,-default- && ret=0;;
(CMAKE_*_STANDARD_REQUIRED) _wanted booleans expl 'boolean' _cmake_booleans && ret=0;;
(CMAKE_*_EXTENSIONS) _wanted booleans expl 'boolean' _cmake_booleans && ret=0;;
(*) _files && ret=0;; (*) _files && ret=0;;
esac esac
@@ -348,6 +368,33 @@ _cmake_build_types() {
_values 'build type' ${build_types[@]} _values 'build type' ${build_types[@]}
} }
# -------------------
# _cmake_c_standards
# -------------------
(( $+functions[_cmake_c_standards] )) ||
_cmake_c_standards() {
local c_standards; c_standards=(
'90'
'99'
'11'
)
_values 'c standard' ${c_standards[@]}
}
# -------------------
# _cmake_cxx_standards
# -------------------
(( $+functions[_cmake_cxx_standards] )) ||
_cmake_cxx_standards() {
local cxx_standards; cxx_standards=(
'98'
'11'
'14'
'17'
)
_values 'cxx standard' ${cxx_standards[@]}
}
# ----------------- # -----------------
# _cmake_generators # _cmake_generators
# ----------------- # -----------------
@@ -405,7 +452,7 @@ _cmake_command() {
} }
local cmake_suggest_build;cmake_suggest_build=( local cmake_suggest_build;cmake_suggest_build=(
'--build[build]' '--build[build]:build dir:_cmake_suggest_builddirs'
) )
if [ $CURRENT -eq 2 ] ; then if [ $CURRENT -eq 2 ] ; then

View File

@@ -0,0 +1,626 @@
#compdef conan
# ------------------------------------------------------------------------------
# Copyright (c) 2010-2017 Github zsh-users - http://github.com/zsh-users
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
# * Redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer.
# * Redistributions in binary form must reproduce the above copyright
# notice, this list of conditions and the following disclaimer in the
# documentation and/or other materials provided with the distribution.
# * Neither the name of the zsh-users nor the
# names of its contributors may be used to endorse or promote products
# derived from this software without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
# ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
# WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
# DISCLAIMED. IN NO EVENT SHALL ZSH-USERS BE LIABLE FOR ANY
# DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
# (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
# LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
# ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
# SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
# ------------------------------------------------------------------------------
# Description
# -----------
#
# Completion script for conan 0.28.1 (https://www.conan.io).
#
# ------------------------------------------------------------------------------
# Authors
# -------
#
# * Julien Nicoulaud <julien.nicoulaud@gmail.com>
#
# ------------------------------------------------------------------------------
_conan() {
local context state state_descr line
typeset -A opt_args
_arguments -C \
'(- : *)'{-h,--help}'[display help information]' \
'(- : *)'{-v,--version}'[display version information]' \
'(-h --help)1: :_conan_commands' \
'(-h --help)*:: :->command_args'
case $state in
command_args)
(( $+functions[_conan_${words[1]}_args] )) && _conan_${words[1]}_args
;;
esac
}
(( $+functions[_conan_commands] )) ||
_conan_commands() {
local consumer_commands creator_commands package_development_commands misc_commands deprecated_commands
consumer_commands=(
'install:installs the requirements specified in a "conanfile.py" or "conanfile.txt"'
'config:manages conan configuration information'
'get:gets a file or list a directory of a given reference or package'
'info:prints information about a package recipe'\''s dependency graph'
'search:search package recipes and binaries in the local cache or in a remote server'
)
creator_commands=(
'new:creates a new package recipe template with a '\''conanfile.py'\'''
'create:export, build package and test it with a consumer project'
'upload:uploads a package recipe and the generated binary packages to a specified remote'
'export:copies the package recipe (conanfile.py and associated files) to your local cache'
'export-pkg:exports a recipe & creates a package with given files calling '\''package'\'''
'test:runs a test-folder/conanfile.py to test an existing package'
)
package_development_commands=(
'source:calls your conanfile.py "source()" method to configure the source directory'
'build:utility command to run your current project "conanfile.py" build() method'
'package:calls your conanfile.py "package" method for a specific package recipe'
)
misc_commands=(
'profile:list profiles in the ".conan/profiles" folder, or show profile details'
'remote:handles the remote list and the package recipes associated to a remote'
'user:update your cached user name (and auth token) to avoid it being requested later'
'imports:execute the "imports" stage of a conanfile.txt or a conanfile.py'
'copy:copy conan recipes and packages to another user/channel.'
'remove:remove any package recipe or binary matching a pattern'
'alias:creates and export an alias recipe'
)
_describe -t 'consumer-commands' "consumer commands" consumer_commands
_describe -t 'creator-commands' "creator commands" creator_commands
_describe -t 'package-development-commands' "package development commands" package_development_commands
_describe -t 'misc-commands' "misc commands" misc_commands
}
(( $+functions[_conan_alias_args] )) ||
_conan_alias_args() {
_arguments -C \
'(- : *)'{-h,--help}'[display help information]' \
'(-h --help)1:alias reference:_conan_package_references' \
'(-h --help)2:target reference:_conan_package_references'
}
(( $+functions[_conan_build_args] )) ||
_conan_build_args() {
_arguments -C \
'(- : *)'{-h,--help}'[display help information]' \
"(-h --help -f --file)"{-f,--file}'[specify conanfile filename]: :_conan_conanfiles' \
"(-h --help -sf --source-folder)"{-sf,--source-folder}'[local folder containing the sources. Defaulted to the directory of the conanfile. A relative path can also be specified (relative to the current directory)]: :_files -/' \
"(-h --help -bf --build-folder)"{-bf,--build-folder}'[build folder, working directory of the build process. Defaulted to the current directory. A relative path can also be specified (relative to the current directory)]: :_files -/' \
"(-h --help -pf --package-folder)"{-pf,--package-folder}'[folder to install the package (when the build system or build() method does it). Defaulted to the '\''{build_folder}/package'\'' folder. A relative path can be specified, relative to the current folder. Also an absolute path is allowed.]: :_files -/' \
"(-h --help -if --install-folder)"{-if,--install-folder}'[local folder containing the conaninfo.txt and conanbuildinfo.txt files (from a previous conan install execution). Defaulted to --build-folder]: :_files -/' \
'(-h --help)1: :_conan_conanfiles'
}
(( $+functions[_conan_config_args] )) ||
_conan_config_args() {
_arguments -C \
'(- : *)'{-h,--help}'[display help information]' \
'(-h --help)1: :_conan_config_commands' \
'(-h --help)*:: :->command_args'
case $state in
command_args)
(( $+functions[_conan_config_${words[1]}_args] )) && _conan_config_${words[1]}_args
;;
esac
}
(( $+functions[_conan_config_commands] )) ||
_conan_config_commands() {
local commands
commands=(
'rm:rm an existing config element'
'set:set/add value'
'get:get the value of existing element'
'install:install a full configuration from a zip file, local or remote'
)
_describe -t 'commands' "command" commands
}
(( $+functions[_conan_config_rm_args] )) ||
_conan_config_rm_args() {
_arguments -C \
'(- : *)'{-h,--help}'[display help information]' \
'(-h --help)1: :_conan_config_keys'
}
(( $+functions[_conan_config_get_args] )) ||
_conan_config_get_args() {
_arguments -C \
'(- : *)'{-h,--help}'[display help information]' \
'(-h --help)1: :_conan_config_keys'
}
(( $+functions[_conan_config_set_args] )) ||
_conan_config_set_args() {
_arguments -C \
'(- : *)'{-h,--help}'[display help information]' \
'(-h --help)1: :_conan_config_set_key_values'
}
(( $+functions[_conan_config_set_key_values] )) ||
_conan_config_set_key_values() {
local ret=1
if compset -P '*='; then
_wanted property-values expl 'config value' _conan_config_values ${IPREFIX%=} && ret=0
else
_wanted property-names expl 'config key' _conan_config_keys -qS= && ret=0
fi
return ret
}
(( $+functions[_conan_config_install_args] )) ||
_conan_config_install_args() {
_arguments -C \
'(- : *)'{-h,--help}'[display help information]' \
'(-h --help)1:config file:_files'
}
(( $+functions[_conan_copy_args] )) ||
_conan_copy_args() {
_arguments -C \
'(- : *)'{-h,--help}'[display help information]' \
"(-h --help --all -p --package)"{-p,--package}'[copy specified package ID]:package reference:_conan_package_references' \
'(-h --help --all -p --package)--all[copy all packages from the specified package recipe]' \
'(-h --help --force)--force[override destination packages and the package recipe]' \
'(-h --help)1: :_conan_package_references' \
'(-h --help)2: :_conan_user_channels'
}
(( $+functions[_conan_export_args] )) ||
_conan_export_args() {
_arguments -C \
'(- : *)'{-h,--help}'[display help information]' \
'(-h --help -p --path)'{-p,--path}'[folder with a conanfile.py (default current directory)]: :_files -/' \
'(-h --help -k --keep-source)'{-k,--keep-source}'[do not remove the source folder in the local cache]' \
'(-h --help -f --file)'{-f,--file}'[specify conanfile filename]: :_conan_conanfiles' \
'(-h --help)1: :_conan_channel_or_package_references'
}
(( $+functions[_conan_get_args] )) ||
_conan_get_args() {
_arguments -C \
'(- : *)'{-h,--help}'[display help information]' \
"(-h --help -p --package)"{-p,--package}'[package ID]: :_conan_package_references' \
'(-h --help -r --remote)'{-r,--remote}'[get from this specific remote]: :_conan_remotes' \
'(-h --help -raw --raw)'{-raw,--raw}'[do not decorate the text]' \
'(-h --help)1: :_conan_package_references' \
'(-h --help)2:file or directory path:_files'
}
(( $+functions[_conan_imports_args] )) ||
_conan_imports_args() {
_arguments -C \
'(- : *)'{-h,--help}'[display help information]' \
'(-h --help -f --file)'{-f,--file}'[use another filename]: :_conan_conanfiles' \
'(-h --help -imf --import-folder)'{-imf,--import-folder}'[directory to copy the artifacts to. By default it will be the current directory]: :_files -/' \
'(-h --help -if --install-folder)'{-if,--install-folder}'[local folder containing the conaninfo.txt and conanbuildinfo.txt files (from a previous conan install execution)]: :_files -/' \
'(-h --help -u --undo)'{-u,--undo}'[undo imports (remove imported files)]' \
'(-h --help)1: :_conan_directory_or_package_references'
}
(( $+functions[_conan_info_args] )) ||
_conan_info_args() {
_arguments -C \
'(- : *)'{-h,--help}'[display help information]' \
'(-h --help -f --file)'{-f,--file}'[specify conanfile filename]: :_conan_conanfiles' \
'(-h --help -n --only)'{-n,--only}'[filter fields]: :_conan_info_only_values' \
'(-h --help --paths)--paths[show package paths in local cache]' \
'(-h --help --package-filter)--package-filter[print information only for packages that match the filter, e.g. MyPackage/1.2@user/channel or MyPackage*]: :_conan_package_references' \
'(-h --help -bo --build_order)'{-bo,--build_order}'[given a modified reference, return an ordered list to build (CI)]' \
'(-h --help -j --json)'{-j,--json}'[only with --build_order option, return the information in a json]: :_files -g "*.json"' \
'(-h --help -g --graph)'{-g,--graph}'[creates file with project dependencies graph]: :_files -g "*.(dot|html)"' \
'(-h --help -u --update)'{-u,--update}'[check updates exist from upstream remotes]' \
'(-h --help -sc --scope)'{-sc,--scope}'[use the specified scope in the install command]: :_conan_scopes' \
'(-h --help -pr --profile)'{-pr,--profile}'[apply the specified profile to the install command]: :_conan_profiles' \
'(-h --help -r --remote)'{-r,--remote}'[look in the specified remote server]: :_conan_remotes' \
'(-h --help)'{-o,--options}'[options to build the package, overwriting the defaults. e.g., -o with_qt=true]: :_conan_options' \
'(-h --help)'{-s,--settings}'[settings to build the package, overwriting the defaults. e.g., -s compiler=gcc]: :_conan_settings' \
'(-h --help)'{-e,--env}'[environment variables that will be set during the package build, -e CXX=/usr/bin/clang++]: :_conan_environment_variables' \
'(-h --help -b --build)'{-b,--build}'[given a build policy (same install command "build" parameter), return an ordered list of packages that would be built from sources in install command (simulation)]: :_conan_build_policies' \
'(-h --help)1: :_conan_conanfile_or_package_references'
}
(( $+functions[_conan_info_only_values] )) ||
_conan_info_only_values() {
local values
values=(
'id:show only "id"'
'build_id:show only "build_id"'
'remote:show only "remote"'
'url:show only "url"'
'license:show only "license"'
'requires:show only "requires"'
'update:show only "update"'
'required:show only "required"'
'date:show only "date"'
'author:show only "author"'
'export_folder:use with --paths'
'build_folder:use with --paths'
'package_folder:use with --paths'
'source_folder:use with --paths'
'None:show only references'
)
_describe -t 'values' "value" values
}
(( $+functions[_conan_install_args] )) ||
_conan_install_args() {
_arguments -C \
'(- : *)'{-h,--help}'[display help information]' \
'(-h --help -f --file)'{-f,--file}'[specify conanfile filename]: :_conan_conanfiles' \
'(-h --help -g --generator)'{-g,--generator}'[generators to use]: :_conan_generators' \
'(-h --help --werror)--werror[error instead of warnings for graph inconsistencies]' \
'(-h --help -if --install-folder)'{-if,--install-folder}'[use this directory to put the generator files, conaninfo/conanbuildinfo.txt, etc.]: :_files -/' \
'(-h --help -m --manifests)'{-m,--manifests}'[install dependencies manifests in folder for later verify]: :_files -/' \
'(-h --help -mi --manifests-interactive)'{-mi,--manifests-interactive}'[install dependencies manifests in folder for later verify]: :_files -/' \
'(-h --help -v --verify)'{-v,--verify}'[verify dependencies manifests against stored ones]: :_files -/' \
'(-h --help --no-imports)--no-imports[install specified packages but avoid running imports]' \
'(-h --help -u --update)'{-u,--update}'[check updates exist from upstream remotes]' \
'(-h --help -sc --scope)'{-sc,--scope}'[use the specified scope in the install command]: :_conan_scopes' \
'(-h --help -pr --profile)'{-pr,--profile}'[apply the specified profile to the install command]: :_conan_profiles' \
'(-h --help -r --remote)'{-r,--remote}'[look in the specified remote server]: :_conan_remotes' \
'(-h --help)'{-o,--options}'[options to build the package, overwriting the defaults. e.g., -o with_qt=true]: :_conan_options' \
'(-h --help)'{-s,--settings}'[settings to build the package, overwriting the defaults. e.g., -s compiler=gcc]: :_conan_settings' \
'(-h --help)'{-e,--env}'[environment variables that will be set during the package build, -e CXX=/usr/bin/clang++]: :_conan_environment_variables' \
'(-h --help -b --build)'{-b,--build}'[given a build policy (same install command "build" parameter), return an ordered list of packages that would be built from sources in install command (simulation)]: :_conan_build_policies' \
'(-h --help)1: :_conan_conanfile'
}
(( $+functions[_conan_new_args] )) ||
_conan_new_args() {
_arguments -C \
'(- : *)'{-h,--help}'[display help information]' \
'(-h --help -t --test)'{-t,--test}'[create test_package skeleton to test package]' \
'(-h --help -i --header)'{-i,--header}'[create a headers only package template]' \
'(-h --help -c --pure_c)'{-c,--pure_c}'[create a pure C package, deleting "self.settings.compiler.libcxx" setting in the configure method]' \
'(-h --help -s --sources)'{-s,--sources}'[create a package with embedded sources in "src" folder, using "exports_sources" instead of retrieving external code with the "source()" method]' \
'(-h --help -b --bare)'{-b,--bare}'[create the minimum package recipe, without build() or package() methods. Useful in combination with "package_files" command]' \
'(-h --help -cis --ci_shared)'{-cis,--ci_shared}'[package will have a "shared" option to be used in CI]' \
'(-h --help -cilg --ci_travis_gcc)'{-cilg,--ci_travis_gcc}'[generate travis-ci files for linux gcc]' \
'(-h --help -cilc --ci_travis_clang)'{-cilc,--ci_travis_clang}'[generate travis-ci files for linux clang]' \
'(-h --help -cio --ci_travis_osx)'{-cio,--ci_travis_osx}'[generate travis-ci files for OSX apple-clang]' \
'(-h --help -ciw --ci_appveyor_win)'{-ciw,--ci_appveyor_win}'[generate appveyor files for Appveyor Visual Studio]' \
'(-h --help -ciglg --ci_gitlab_gcc)'{-ciglg,--ci_gitlab_gcc}'[generate GitLab files for linux gcc]' \
'(-h --help -ciglc --ci_gitlab_clang)'{-ciglc,--ci_gitlab_clang}'[generate GitLab files for linux clang]' \
'(-h --help -gi --gitignore)'{-gi,--gitignore}'[generate a .gitignore with the known patterns to exclude]' \
'(-h --help -ciu --ci_upload_url)'{-ciu,--ci_upload_url}'[define URL of the repository to upload]: :_urls' \
'(-h --help)1: :_conan_package_references'
}
(( $+functions[_conan_package_args] )) ||
_conan_package_args() {
_arguments -C \
'(- : *)'{-h,--help}'[display help information]' \
"(-h --help -sf --source-folder)"{-sf,--source-folder}'[local folder containing the sources. Defaulted to the directory of the conanfile. A relative path can also be specified (relative to the current directory)]: :_files -/' \
"(-h --help -bf --build-folder)"{-bf,--build-folder}'[build folder, working directory of the build process. Defaulted to the current directory. A relative path can also be specified (relative to the current directory)]: :_files -/' \
"(-h --help -pf --package-folder)"{-pf,--package-folder}'[folder to install the package (when the build system or build() method does it). Defaulted to the '\''{build_folder}/package'\'' folder. A relative path can be specified, relative to the current folder. Also an absolute path is allowed.]: :_files -/' \
"(-h --help -if --install-folder)"{-if,--install-folder}'[local folder containing the conaninfo.txt and conanbuildinfo.txt files (from a previous conan install execution). Defaulted to --build-folder]: :_files -/' \
'(-h --help)1: :_conan_package_references' \
'(-h --help)2:package ID:'
}
(( $+functions[_conan_profile_args] )) ||
_conan_profile_args() {
_arguments -C \
'(- : *)'{-h,--help}'[display help information]' \
'(-h --help)1: :_conan_profile_commands' \
'(-h --help)*:: :->command_args'
case $state in
command_args)
(( $+functions[_conan_profile_${words[1]}_args] )) && _conan_profile_${words[1]}_args
;;
esac
}
(( $+functions[_conan_profile_commands] )) ||
_conan_profile_commands() {
local commands
commands=(
'list:list current profiles'
'show:show the values defined for a profile'
'new:creates a new empty profile'
'update:update a profile'
'get:get a profile key'
'remove:remove a profile key'
)
_describe -t 'commands' "command" commands
}
(( $+functions[_conan_profile_list_args] )) ||
_conan_profile_list_args() {
_arguments -C \
'(- : *)'{-h,--help}'[display help information]'
}
(( $+functions[_conan_profile_show_args] )) ||
_conan_profile_show_args() {
_arguments -C \
'(- : *)'{-h,--help}'[display help information]' \
'(-h --help)1: :_conan_profiles'
}
(( $+functions[_conan_profile_new_args] )) ||
_conan_profile_new_args() {
_arguments -C \
'(- : *)'{-h,--help}'[display help information]' \
'(-h --help)--detect[autodetect settings and fill \[settings\] section]' \
'(-h --help)1:profile name:'
}
(( $+functions[_conan_profile_update_args] )) ||
_conan_profile_update_args() {
_arguments -C \
'(- : *)'{-h,--help}'[display help information]' \
'(-h --help)1: :_conan_settings' \
'(-h --help)2: :_conan_profiles'
}
(( $+functions[_conan_profile_get_args] )) ||
_conan_profile_get_args() {
_arguments -C \
'(- : *)'{-h,--help}'[display help information]' \
'(-h --help)1: :_conan_setting_keys' \
'(-h --help)2: :_conan_profiles'
}
(( $+functions[_conan_profile_remove_args] )) ||
_conan_profile_remove_args() {
_arguments -C \
'(- : *)'{-h,--help}'[display help information]' \
'(-h --help)1: :_conan_setting_keys' \
'(-h --help)2: :_conan_profiles'
}
(( $+functions[_conan_remote_args] )) ||
_conan_remote_args() {
_arguments -C \
'(- : *)'{-h,--help}'[display help information]' \
'(-h --help)1: :_conan_remote_commands' \
'(-h --help)*:: :->command_args'
case $state in
command_args)
(( $+functions[_conan_remote_${words[1]}_args] )) && _conan_remote_${words[1]}_args
;;
esac
}
(( $+functions[_conan_remote_commands] )) ||
_conan_remote_commands() {
local commands
commands=(
'list:list current remotes'
'add:add a remote'
'remove:remove a remote'
'update:update the remote url'
'list_ref:list the package recipes and its associated remotes'
'add_ref:associate a recipe'\''s reference to a remote'
'remove_ref:dissociate a recipe'\''s reference and its remote'
'update_ref:update the remote associated with a package recipe'
)
_describe -t 'commands' "command" commands
}
(( $+functions[_conan_remote_list_args] )) ||
_conan_remote_list_args() {
_arguments -C \
'(- : *)'{-h,--help}'[display help information]'
}
(( $+functions[_conan_remote_add_args] )) ||
_conan_remote_add_args() {
_arguments -C \
'(- : *)'{-h,--help}'[display help information]' \
'(-h --help)1:name of the remote:' \
'(-h --help)2:url of the remote:_urls' \
'(-h --help)3:verify SSL certificate:(True False)'
}
(( $+functions[_conan_remote_remove_args] )) ||
_conan_remote_remove_args() {
_arguments -C \
'(- : *)'{-h,--help}'[display help information]' \
'(-h --help)1: :_conan_remotes'
}
(( $+functions[_conan_remote_update_args] )) ||
_conan_remote_update_args() {
_arguments -C \
'(- : *)'{-h,--help}'[display help information]' \
'(-h --help)1: :_conan_remotes' \
'(-h --help)2:url of the remote:_urls' \
'(-h --help)3:verify SSL certificate:(True False)'
}
(( $+functions[_conan_remote_list_ref_args] )) ||
_conan_remote_list_ref_args() {
_arguments -C \
'(- : *)'{-h,--help}'[display help information]'
}
# TODO complete conan remote add_ref
# TODO complete conan remote remove_ref
# TODO complete conan remote update_ref
# TODO complete conan remove
# TODO complete conan search
# TODO complete conan source
# TODO complete conan upload
# TODO complete conan user
# TODO complete conan export-pkg
# TODO complete conan test
(( $+functions[_conan_conanfiles] )) ||
_conan_conanfiles() {
_files -g '*.py'
}
(( $+functions[_conan_conanfile_or_package_references] )) ||
_conan_conanfile_or_package_references() {
_alternative \
'conanfile: :_conan_conanfiles' \
'package-references: :_conan_package_references'
}
(( $+functions[_conan_directory_or_package_references] )) ||
_conan_directory_or_package_references() {
_alternative \
'directory: :_files -/' \
'package-references: :_conan_package_references'
}
(( $+functions[_conan_channel_or_package_references] )) ||
_conan_channel_or_package_references() {
_alternative \
'package-references: :_conan_package_references' \
'user-channels: :_conan_user_channels'
}
(( $+functions[_conan_package_references] )) ||
_conan_package_references() {
_guard '[^\-]#' 'package reference' # TODO complete package references
}
(( $+functions[_conan_user_channels] )) ||
_conan_user_channels() {
_guard '[^\-]#' 'user channel' # TODO complete user channels
}
(( $+functions[_conan_remotes] )) ||
_conan_remotes() {
local remotes; remotes=(${(f)"$(_call_program remotes $service remote list)"})
_describe -t remotes 'remote' remotes "$@"
}
(( $+functions[_conan_scopes] )) ||
_conan_scopes() {
_guard '[^\-]#' 'scope' # TODO complete scopes
}
(( $+functions[_conan_build_policies] )) ||
_conan_build_policies() {
_guard '[^\-]#' 'build policy' # TODO complete build policies
}
(( $+functions[_conan_generators] )) ||
_conan_generators() {
_guard '[^\-]#' 'generator' # TODO complete generators
}
(( $+functions[_conan_profiles] )) ||
_conan_profiles() {
local profiles; profiles=(${(f)"$(_call_program profiles $service profile list)"})
_describe -t profiles 'profile' profiles "$@"
}
(( $+functions[_conan_config_keys] )) ||
_conan_config_keys() {
_guard '[^\-]#' 'config key' # TODO complete config keys
}
(( $+functions[_conan_config_values] )) ||
_conan_config_values() {
_guard '[^\-]#' 'config value' # TODO complete config values
}
(( $+functions[_conan_options] )) ||
_conan_options() {
local ret=1
if compset -P '*='; then
_wanted option-values expl 'option value' _conan_option_values ${IPREFIX%=} && ret=0
else
_wanted option-names expl 'option key' _conan_option_keys -qS= && ret=0
fi
return ret
}
(( $+functions[_conan_option_keys] )) ||
_conan_option_keys() {
_guard '[^\-]#' 'option key' # TODO complete option keys
}
(( $+functions[_conan_option_values] )) ||
_conan_option_values() {
_guard '[^\-]#' 'option value' # TODO complete option values
}
(( $+functions[_conan_settings] )) ||
_conan_settings() {
local ret=1
if compset -P '*='; then
_wanted setting-values expl 'setting value' _conan_setting_values ${IPREFIX%=} && ret=0
else
_wanted setting-names expl 'setting key' _conan_setting_keys -qS= && ret=0
fi
return ret
}
(( $+functions[_conan_setting_keys] )) ||
_conan_setting_keys() {
_guard '[^\-]#' 'setting key' # TODO complete setting keys
}
(( $+functions[_conan_setting_values] )) ||
_conan_setting_values() {
_guard '[^\-]#' 'setting value' # TODO complete setting values
}
(( $+functions[_conan_environment_variables] )) ||
_conan_environment_variables() {
local ret=1
if compset -P '*='; then
_wanted environment_variable-values expl 'environment variable value' _conan_environment_variable_values ${IPREFIX%=} && ret=0
else
_wanted environment_variable-names expl 'environment variable' _conan_environment_variable_keys -qS= && ret=0
fi
return ret
}
(( $+functions[_conan_environment_variable_keys] )) ||
_conan_environment_variable_keys() {
_parameters -g "*export*"
}
(( $+functions[_conan_environment_variable_values] )) ||
_conan_environment_variable_values() {
_guard '[^\-]#' 'environment variable value' # TODO complete environment variable values
}
_conan "$@"
# Local Variables:
# mode: Shell-Script
# sh-indentation: 2
# indent-tabs-mode: nil
# sh-basic-offset: 2
# End:
# vim: ft=zsh sw=2 ts=2 et

View File

@@ -0,0 +1,53 @@
#compdef ecdsautil
# ------------------------------------------------------------------------------
# Description
# -----------
#
# Completion script for ecdsautils v0.4.0 (https://github.com/tcatm/ecdsautils)
#
# ------------------------------------------------------------------------------
# Authors
# -------
#
# * Robinhuett <https://github.com/Robinhuett>
#
# ------------------------------------------------------------------------------
_ecdsautil_args() {
case $words[1] in
(sign)
_arguments '1:somefile:_files'
;;
(verify)
_arguments '-s[signature]:secret:_files' '-p[publickey]:pubkey:_files' \
'-n[signaturecount]:signaturecount:""' ':file:_files'
;;
esac
}
_ecdsautil() {
local -a commands
commands=(
"help:Show help"
"generate-key:generate a new secret on stdout"
"show-key:output public key of secret read from stdin"
"sign:sign file"
"verify:verify signature of file"
)
_arguments -C \
'1:cmd:->cmds' \
'*:: :->args'
case "$state" in
(cmds)
_describe -t commands 'commands' commands
;;
(*)
_ecdsautil_args
;;
esac
}
_ecdsautil "$@"

View File

@@ -0,0 +1,442 @@
#compdef flutter
# ------------------------------------------------------------------------------
#MIT License
#
#Copyright (c) 2018 Nickolay Simonov
#
#Permission is hereby granted, free of charge, to any person obtaining a copy
#of this software and associated documentation files (the "Software"), to deal
#in the Software without restriction, including without limitation the rights
#to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
#copies of the Software, and to permit persons to whom the Software is
#furnished to do so, subject to the following conditions:
#
#The above copyright notice and this permission notice shall be included in all
#copies or substantial portions of the Software.
#
#THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
#IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
#FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
#AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
#LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
#OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
#SOFTWARE.
# ------------------------------------------------------------------------------
# Description
# -----------
#
# Completion script for the Flutter.io sdk's cli tool (https://flutter.io)
#
# ------------------------------------------------------------------------------
# Authors
# -------
#
# * Nikolai Simonov (https://github.com/NiKoTron) <nickolay.simonov@gmail.com>
#
# ------------------------------------------------------------------------------
_flutter() {
typeset -A opt_args
local context state line
local curcontext="$curcontext"
local ret=1
_arguments -C -A "-*" \
'(-h --help)'{-h,--help}'[Print this usage information.]' \
'(-v --verbose)'{-v,--verbose}'[Noisy logging, including all shell commands executed.]' \
'--quiet[Reduce the amount of output from some commands.]' \
'(-d --device-id)'{-d,--device-id}'[Target device id or name (prefixes allowed).]' \
'--version[Reports the version of this tool.]' \
'--color[Whether to use terminal colors.]' \
'--no-color[Whether to use terminal colors.]' \
'--suppress-analytics[Suppress analytics reporting when this command runs.]' \
'--bug-report[Captures a bug report file to submit to the Flutter team (contains local paths, device identifiers, and log snippets).]' \
'--packages[Path to your ".packages" file. (required, since the current directory does not contain a ".packages" file)]' \
'--flutter-root[The root directory of the Flutter repository (uses $FLUTTER_ROOT if set).]' \
'1: :_root_commands' \
'*::arg:->args' \
&& ret=0
case "$state" in
(args)
case $words[1] in
(help)
_arguments -C \
'1: :_root_commands' \
&& ret=0
;;
(analyze)
_arguments -C \
'(-h --help)'{-h,--help}'[Print this usage information.]' \
'--flutter-repo[Include all the examples and tests from the Flutter repository.]' \
'--no-flutter-repo[Include all the examples and tests from the Flutter repository.]' \
'--current-package[Include the lib/main.dart file from the current directory, if any. (defaults to on)]' \
'--no-current-package[Include the lib/main.dart file from the current directory, if any. (defaults to on)]' \
'--watch[Run analysis continuously, watching the filesystem for changes.]' \
'--preview-dart-2[Preview Dart 2.0 functionality. (defaults to on)]' \
'--no-preview-dart-2[Preview Dart 2.0 functionality. (defaults to on)]' \
'--write=[Also output the results to a file. This is useful with --watch if you want a file to always contain the latest results.]: :_files -/' \
'--pub[Whether to run "flutter packages get" before executing this command. (defaults to on)]' \
'--no-pub[Whether to run "flutter packages get" before executing this command. (defaults to on)]' \
'--congratulate[When analyzing the flutter repository, show output even when there are no errors, warnings, hints, or lints. (defaults to on)]' \
'--no-congratulate[When analyzing the flutter repository, show output even when there are no errors, warnings, hints, or lints. (defaults to on)]' \
'--preamble[When analyzing the flutter repository, display the number of files that will be analyzed. (defaults to on)]' \
'--no-preamble[When analyzing the flutter repository, display the number of files that will be analyzed. (defaults to on)]' \
&& ret=0
;;
(build)
_arguments -C \
'1: :_build_entities' \
'(-h --help)'{-h,--help}'[Print this usage information.]' \
&& ret=0
;;
(channel)
_arguments -C \
'(-h --help)'{-h,--help}'[Print this usage information.]' \
&& ret=0
;;
(clean)
_arguments -C \
'(-h --help)'{-h,--help}'[Print this usage information.]' \
&& ret=0
;;
(config)
_arguments -C \
'(-h --help)'{-h,--help}'[Print this usage information.]' \
'--analytics[Enable or disable reporting anonymously tool usage statistics and crash reports.]' \
'--no-analytics[Enable or disable reporting anonymously tool usage statistics and crash reports.]' \
'--clear-ios-signing-cert[Clear the saved development certificate choice used to sign apps for iOS device deployment.]' \
'--gradle-dir[The gradle install directory.]' \
'--android-sdk[The Android SDK directory.]' \
'--android-studio-dir[The Android Studio install directory.]' \
&& ret=0
;;
(create)
_arguments -C \
'(-h --help)'{-h,--help}'[Print this usage information.]' \
'--pub[Whether to run "flutter packages get" after the project has been created. (defaults to on)]' \
'--no-pub[Whether to run "flutter packages get" after the project has been created. (defaults to on)]' \
'--offline[When "flutter packages get" is run by the create command, this indicates whether to run it in offline mode or not. In offline mode, it will need to have all dependencies already available in the pub cache to succeed.]' \
'--no-offline[When "flutter packages get" is run by the create command, this indicates whether to run it in offline mode or not. In offline mode, it will need to have all dependencies already available in the pub cache to succeed.]' \
"--with-driver-test[Also add a flutter_driver dependency and generate a sample 'flutter drive' test.]" \
"--no-with-driver-test[Also add a flutter_driver dependency and generate a sample 'flutter drive' test.]" \
'(-t= --template=)'{-t=,--template=}'[Specify the type of project to create.]: :_project_templates' \
"--description[The description to use for your new Flutter project. This string ends up in the pubspec.yaml file. (defaults to 'A new Flutter project.')]" \
"--org[The organization responsible for your new Flutter project, in reverse domain name notation. This string is used in Java package names and as prefix in the iOS bundle identifier. (defaults to 'com.example')]" \
'(-i= --ios-language)'{-i=,--ios-language}'[iOS project language]: :_ios_languages' \
'(-a= --android-language)'{-a=,--android-language}'[Android project language]: :_droid_languages' \
&& ret=0
;;
(daemon)
_arguments -C \
'(-h --help)'{-h,--help}'[Print this usage information.]' \
&& ret=0
;;
(devices)
_arguments -C \
'(-h --help)'{-h,--help}'[Print this usage information.]' \
&& ret=0
;;
(doctor)
_arguments -C \
'(-h --help)'{-h,--help}'[Print this usage information.]' \
"--android-licenses[Run the Android SDK manager tool to accept the SDK's licenses.]" \
&& ret=0
;;
(drive)
_arguments -C \
'(-h --help)'{-h,--help}'[Print this usage information.]' \
'--debug[Build a debug version of your app (default mode).]' \
'--profile[Build a version of your app specialized for performance profiling.]' \
'--release[Build a release version of your app.]' \
'--flavor[Build a custom app flavor as defined by platform-specific build setup. Supports the use of product flavors in Android Gradle scripts. Supports the use of custom Xcode schemes.]' \
'--trace-startup[Start tracing during startup.]' \
'--route[Which route to load when running the app.]' \
'--target-platform[Specify the target platform when building the app for an Android device. Ignored on iOS.]: :_target_platforms' \
'(-t= --target=)'{-t=,--target=}'[The main entry-point file of the application, as run on the device. If the --target option is omitted, but a file name is provided on the command line, then that is used instead. (defaults to "lib/main.dart")]: :_files -g "*.dart"' \
'--observatory-port[Listen to the given port for an observatory debugger connection. Specifying port 0 will find a random free port. Defaults to the first available port after 8100.]' \
'--pub[Whether to run "flutter packages get" before executing this command. (defaults to on)]' \
'--no-pub[Whether to run "flutter packages get" before executing this command. (defaults to on)]' \
'--no-keep-app-running[Will keep the Flutter application running when done testing. By default, "flutter drive" stops the application after tests are finished, and --keep-app-running overrides this. On the other hand, if --use-existing-app is specified, then "flutter drive" instead defaults to leaving the application running, and --no-keep-app-running overrides it.]' \
'--keep-app-running[Will keep the Flutter application running when done testing. By default, "flutter drive" stops the application after tests are finished, and --keep-app-running overrides this. On the other hand, if --use-existing-app is specified, then "flutter drive" instead defaults to leaving the application running, and --no-keep-app-running overrides it.]' \
'--use-existing-app=[Connect to an already running instance via the given observatory URL. If this option is given, the application will not be automatically started, and it will only be stopped if --no-keep-app-running is explicitly set.]' \
'--driver=[The test file to run on the host (as opposed to the target file to run on the device). By default, this file has the same base name as the target file, but in the "test_driver/" directory instead, and with "_test" inserted just before the extension, so e.g. if the target is "lib/main.dart", the driver will be "test_driver/main_test.dart".]: :_files' \
'--preview-dart-2[Preview Dart 2.0 functionality. (defaults to on)]' \
'--no-preview-dart-2[Preview Dart 2.0 functionality. (defaults to on)]' \
&& ret=0
;;
(format)
_arguments -C \
'(-h --help)'{-h,--help}'[Print this usage information.]' \
&& ret=0
;;
(fuchsia_reload)
_arguments -C \
'(-h --help)'{-h,--help}'[Print this usage information.]' \
'--debug[Build a debug version of your app (default mode).]' \
'--profile[Build a version of your app specialized for performance profiling.]' \
'--release[Build a release version of your app.]' \
'(-a --address)'{-a,--address}'[Fuchsia device network name or address.]' \
'(-b --build-dir)'{-b,--build-dir}'[Fuchsia build directory, e.g. out/release-x86-64.]' \
'(-g --gn-target)'{-g,--gn-target}'[GN target of the application, e.g //path/to/app:app.]' \
'(-i --isolate-number)'{-i,--isolate-number}'[To reload only one instance, specify the isolate number, e.g. the number in foo$main-###### given by --list.]' \
'(-l --list)'{-l,--list}'[Lists the running modules.]' \
'(-l --no-list)'{-l,--no-list}'[Lists the running modules.]' \
'(-n --name-override)'{-n,--name-override}'[On-device name of the application binary.]' \
'(-2 --preview-dart-2)'{-2,--preview-dart-2}'[Preview Dart 2.0 functionality.]' \
'(-2 --no-preview-dart-2)'{-2,--no-preview-dart-2}'[Preview Dart 2.0 functionality.]' \
'(-t --target)'{-t,--target}'[Target app path / main entry-point file. Relative to --gn-target path, e.g. lib/main.dart. (defaults to "lib/main.dart")]' \
&& ret=0
;;
(ide-config)
_arguments -C \
'(-h --help)'{-h,--help}'[Print this usage information.]' \
'--overwrite[When performing operations, overwrite existing files.]' \
'--no-overwrite[When performing operations, overwrite existing files.]' \
'--update-templates[Update the templates in the template directory from the current configuration files. This is the opposite of what ide-config usually does. Will search the flutter tree for .iml files and copy any missing ones into the template directory. If --overwrite is also specified, it will update any out-of-date files, and remove any deleted files from the template directory.]' \
&& ret=0
;;
(inject-plugins)
_arguments -C \
'(-h --help)'{-h,--help}'[Print this usage information.]' \
&& ret=0
;;
(install)
_arguments -C \
'(-h --help)'{-h,--help}'[Print this usage information.]' \
&& ret=0
;;
(logs)
_arguments -C \
'(-h --help)'{-h,--help}'[Print this usage information.]' \
'(-c --clear)'{-c,--clear}'[Clear log history before reading from logs.]' \
&& ret=0
;;
(packages)
_arguments -C \
'1: :_package_subcomands' \
'*::pkg-arg:->pkg-args' \
&& ret=0
case "$state" in
(pkg-args)
case $words[1] in
(get)
_arguments \
'(-h --help)'{-h,--help}'[Print this usage information.]' \
'--offline[Use cached packages instead of accessing the network.]' \
&& ret=0
;;
(pub)
_arguments \
'(-h --help)'{-h,--help}'[Print this usage information.]' \
&& ret=0
;;
(test)
_arguments \
'(-h --help)'{-h,--help}'[Print this usage information.]' \
&& ret=0
;;
(upgrade)
_arguments \
'(-h --help)'{-h,--help}'[Print this usage information.]' \
'--offline[Use cached packages instead of accessing the network.]' \
&& ret=0
;;
esac
;;
esac
;;
(precache)
_arguments -C \
'(-h --help)'{-h,--help}'[Print this usage information.]' \
'(-a --all-platforms)'{-a,--all-platforms}'[Precache artifacts for all platforms.]' \
&& ret=0
;;
(run)
_arguments -C \
'(-h --help)'{-h,--help}'[Print this usage information.]' \
'--debug[Build a debug version of your app (default mode).]' \
'--profile[Build a version of your app specialized for performance profiling.]' \
'--release[Build a release version of your app.]' \
'--flavor[Build a custom app flavor as defined by platform-specific build setup. Supports the use of product flavors in Android Gradle scripts. Supports the use of custom Xcode schemes.]' \
'--trace-startup[Start tracing during startup.]' \
'--route[Which route to load when running the app.]' \
'--target-platform[Specify the target platform when building the app for an Android device. Ignored on iOS.]: :_target_platforms' \
'(-t= --target=)'{-t=,--target=}'[The main entry-point file of the application, as run on the device. If the --target option is omitted, but a file name is provided on the command line, then that is used instead. (defaults to "lib/main.dart")]: :_files -g "*.dart"' \
'--observatory-port[Listen to the given port for an observatory debugger connection. Specifying port 0 will find a random free port. Defaults to the first available port after 8100.]' \
'--pub[Whether to run "flutter packages get" before executing this command. (defaults to on)]' \
'--no-pub[Whether to run "flutter packages get" before executing this command. (defaults to on)]' \
'--full-restart[Stop any currently running application process before running the app. (defaults to on)]' \
'--no-full-restart[Stop any currently running application process before running the app. (defaults to on)]' \
'--start-paused[Start in a paused mode and wait for a debugger to connect.]' \
'--enable-software-rendering[Enable rendering using the Skia software backend. This is useful when testing Flutter on emulators. By default, Flutter will attempt to either use OpenGL or Vulkan and fall back to software when neither is available.]' \
'--skia-deterministic-rendering[When combined with --enable-software-rendering, provides 100% deterministic Skia rendering.]' \
'--trace-skia[Enable tracing of Skia code. This is useful when debugging the GPU thread. By default, Flutter will not log skia code.]' \
'--use-test-fonts[Enable (and default to) the "Ahem" font. This is a special font used in tests to remove any dependencies on the font metrics. It is enabled when you use "flutter test". Set this flag when running a test using "flutter run" for debugging purposes. This flag is only available when running in debug mode.]' \
'--no-use-test-fonts[Enable (and default to) the "Ahem" font. This is a special font used in tests to remove any dependencies on the font metrics. It is enabled when you use "flutter test". Set this flag when running a test using "flutter run" for debugging purposes. This flag is only available when running in debug mode.]' \
'--build[If necessary, build the app before running. (defaults to on)]' \
'--no-build[If necessary, build the app before running. (defaults to on)]' \
'--hot[Run with support for hot reloading. (defaults to on)]' \
'--no-hot[Run with support for hot reloading. (defaults to on)]' \
'--pid-file[Specify a file to write the process id to. You can send SIGUSR1 to trigger a hot reload and SIGUSR2 to trigger a full restart.]' \
&& ret=0
;;
(screenshot)
_arguments -C \
'(-h --help)'{-h,--help}'[Print this usage information.]' \
'(-o --out)'{-o,--out}'[Location to write the screenshot.]: :_files' \
'--skia=[Retrieve the last frame rendered by a Flutter app as a Skia picture using the specified observatory port. To find the observatory port number, use "flutter run --verbose" and look for "Forwarded host port ... for Observatory" in the output.]' \
&& ret=0
;;
(stop)
_arguments -C \
'(-h --help)'{-h,--help}'[Print this usage information.]' \
&& ret=0
;;
(test)
_arguments -C \
'(-h --help)'{-h,--help}'[Print this usage information.]' \
'--pub[Whether to run "flutter packages get" before executing this command. (defaults to on)]' \
'--no-pub[Whether to run "flutter packages get" before executing this command. (defaults to on)]' \
'--name=[A regular expression matching substrings of the names of tests to run.]' \
'--plain-name=[A plain-text substring of the names of tests to run.]' \
'--start-paused[Start in a paused mode and wait for a debugger to connect. You must explicitly specify a single test file to run. Instructions for connecting with a debugger are printed to the console once the test has started.]' \
'--coverage[Whether to collect coverage information.]' \
'--merge-coverage[Whether to merge coverage data with "coverage/lcov.base.info". Implies collecting coverage data. (Requires lcov)]' \
'--coverage-path[Where to store coverage information (if coverage is enabled). (defaults to "coverage/lcov.info")]' \
&& ret=0
;;
(trace)
_arguments -C \
'(-h --help)'{-h,--help}'[Print this usage information.]' \
'--start[Start tracing.]' \
'--stop[Stop tracing.]' \
'--out[Specify the path of the saved trace file.]' \
'(-d --duration)'{-d,--duration}'[Duration in seconds to trace. (defaults to "10")]' \
'--debug-port[Local port where the observatory is listening. (defaults to "8100")]' \
&& ret=0
;;
(update-packages)
_arguments -C \
'(-h --help)'{-h,--help}'[Print this usage information.]' \
'--force-upgrade[Attempt to update all the dependencies to their latest versions. This will actually modify the pubspec.yaml files in your checkout.]' \
'--no-force-upgrade[Attempt to update all the dependencies to their latest versions. This will actually modify the pubspec.yaml files in your checkout.]' \
'--paths[Finds paths in the dependency chain leading from package specified in --from to package specified in --to.]' \
'--no-paths[Finds paths in the dependency chain leading from package specified in --from to package specified in --to.]' \
'--from[Used with flag --dependency-path. Specifies the package to begin searching dependency path from.]' \
'--to[Used with flag --dependency-path. Specifies the package that the sought after dependency path leads to.]' \
'--transitive-closure[Prints the dependency graph that is the transitive closure of packages the Flutter SDK depends on.]' \
'--no-transitive-closure[Prints the dependency graph that is the transitive closure of packages the Flutter SDK depends on.]' \
'--verify-only[Verify the package checksums without changing or updating dependencies.]' \
'--no-verify-only[Verify the package checksums without changing or updating dependencies.]' \
&& ret=0
;;
(upgrade)
_arguments -C \
'(-h --help)'{-h,--help}'[Print this usage information.]' \
&& ret=0
;;
esac
;;
esac
return ret
}
(( $+functions[_root_commands] )) ||
_root_commands() {
local commands;
commands=(
"analyze:Analyze the project's Dart code."
'build:Flutter build commands.'
'channel:List or switch flutter channels.'
'clean:Delete the build/ directory.'
'config:Configure Flutter settings.'
'create:Create a new Flutter project.'
'daemon:Run a persistent, JSON-RPC based server to communicate with devices.'
'devices:List all connected devices.'
'doctor:Show information about the installed tooling.'
'drive:Runs Flutter Driver tests for the current project.'
'format:Format one or more dart files.'
'fuchsia_reload:Hot reload on Fuchsia.'
'help:Display help information for flutter.'
'ide-config:Configure the IDE for use in the Flutter tree.'
'inject-plugins:Re-generates the GeneratedPluginRegistrants.'
'install:Install a Flutter app on an attached device.'
'logs:Show log output for running Flutter apps.'
'packages:Commands for managing Flutter packages.'
"precache:Populates the Flutter tool's cache of binary artifacts."
'run:Run your Flutter app on an attached device.'
'screenshot:Take a screenshot from a connected device.'
'stop:Stop your Flutter app on an attached device.'
'test:Run Flutter unit tests for the current project.'
'trace:Start and stop tracing for a running Flutter app.'
'update-packages:Update the packages inside the Flutter repo.'
'upgrade:Upgrade your copy of Flutter.')
_describe -t commands 'command' commands "$@"
}
(( $+functions[_build_entities] )) ||
_build_entities() {
local entities;
entities=("aot:Build an ahead-of-time compiled snapshot of your app's Dart code."
"apk:Build an Android APK file from your app."
"flx:Build a Flutter FLX file from your app."
"ios:Build an iOS application bundle (Mac OS X host only).")
_describe -t entities 'entity' entities "$@"
}
(( $+functions[_project_templates] )) ||
_project_templates() {
local templates;
templates=("app:(default) Generate a Flutter application."
"package:Generate a shareable Flutter project containing modular Dart code."
"plugin:Generate a shareable Flutter project containing an API in Dart code with a platform-specific implementation for Android, for iOS, or for both.")
_describe -t templates 'template' templates "$@"
}
(( $+functions[_ios_languages] )) ||
_ios_languages() {
local languages;
languages=("objc:(default) Objective-C."
"swift:Swift.")
_describe -t languages 'language' languages "$@"
}
(( $+functions[_droid_languages] )) ||
_droid_languages() {
local languages;
languages=("java:(default) Java."
"kotlin:Kotlin.")
_describe -t languages 'language' languages "$@"
}
(( $+functions[_target_platforms] )) ||
_target_platforms() {
local platforms;
platforms=("default:(default) default."
"android-arm:android-arm."
"android-arm64:android-arm64.")
_describe -t platforms 'platform' platforms "$@"
}
(( $+functions[_package_subcomands] )) ||
_package_subcomands() {
local subcommands;
subcommands=("get:Get packages in a Flutter project."
"pub:Pass the remaining arguments to Dart's 'pub' tool."
"test:Run the 'test' package."
"upgrade:Upgrade packages in a Flutter project.")
_describe -t subcommands 'subcommand' subcommands "$@"
}
_flutter "$@"
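Completion functions such as `_flutter` above are picked up by zsh's completion system from directories on `$fpath`. A minimal sketch of wiring one in manually (the `~/.zsh/completions` directory is an assumed, conventional location, not something this file mandates):

```shell
# Create a personal completions directory (assumed location).
mkdir -p "$HOME/.zsh/completions"

# The completion file would normally be copied there, keeping its
# underscore-prefixed name so compinit can find it by #compdef tag, e.g.:
#   cp _flutter "$HOME/.zsh/completions/_flutter"

# Then, in .zshrc, add the directory to fpath *before* running compinit:
#   fpath=("$HOME/.zsh/completions" $fpath)
#   autoload -Uz compinit && compinit
```

With prezto, the `completion` module runs `compinit` for you, so only the `fpath` addition is needed before the module loads.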

View File

@@ -92,15 +92,15 @@ _ghc_compiler ()
  '-c[Stop after generating object files]' \
  '-eventlog[Enable runtime event tracing]' \
  '-debug[Use the debugging runtime]' \
  "-dylib-install-name[On Darwin/macOS only, set the install name]" \
  '-dynamic[Use dynamic Haskell libraries]' \
  '-dynamic-too[Build dynamic object files as well as static object files during compilation]' \
  '-dynosuf[Set the output suffix for dynamic object files]' \
  '-dynload[Select one of a number of modes for finding shared libraries at runtime]' \
  '--mk-dll[DLL-creation mode (Windows only)]' \
  '-framework-path[On Darwin/macOS/iOS only, add dir to the list of directories searched for frameworks]' \
  '-shared[Generate a shared library (as opposed to an executable)]' \
  '-staticlib[On Darwin/macOS/iOS only, generate a standalone static library (as opposed to an executable)]' \
  '-e[Evaluate expression]' \
  '-hide-all-packages[Hide all packages by default]' \
  '-hpcdir[Directory to deposit .mix files during compilation (default is .hpc)]' \

View File

@@ -41,23 +41,35 @@
_glances() {
  _arguments \
    "-0[Divide task CPU usage by the total number of CPUs]" \
    "-1[Start Glances in per CPU mode]" \
    "-2[Disable left sidebar]" \
    "-3[Disable quick look module]" \
    "-4[Disable all but quick look and load]" \
    "-5[Disable top menu]" \
    "-6[Start Glances in mean GPU mode]" \
    "-b[Display network rate in Byte per second]" \
    "-B[Bind server to the given IP or host NAME]:host:_hosts" \
    "-c[Connect to a Glances server]:host:_hosts" \
    "-C[Path to the configuration file]:configuration path:_files -/" \
    "-d[Enable debug mode]" \
    "-h[Display the syntax and exit]" \
    "-o[Define additional output (available: HTML or CSV)]:output type:(HTML CSV)" \
    "-p[Define the client or server TCP port (default: 61209)]:port:_ports" \
    "-q[Disable the curses interface]" \
    "-s[Run Glances in server mode]" \
    "-t[Set the refresh time in seconds (default: 3)]:seconds:" \
    "-V[Display the version and exit]" \
    "-w[Run Glances in web server mode]" \
    "-z[Do not use the bold color attribute]" \
    "--browser[Start the client browser]" \
    "--disable-bg[Disable background colors in the terminal]" \
    "--disable-bold[Disable bold mode in the terminal]" \
    "--hide-kernel-threads[Hide kernel threads in process list]" \
    "--password[Define a client/server password]" \
    "--theme-white[Optimize display colors for white background]" \
    "--tree[Display processes as tree]" \
    "--username[Define a client/server username]"
}

_glances "$@"

View File

@@ -1,554 +0,0 @@
#compdef go
# ------------------------------------------------------------------------------
# Copyright (c) 2016 Github zsh-users - http://github.com/zsh-users
# Copyright (c) 2013-2015 Robby Russell and contributors (see
# https://github.com/robbyrussell/oh-my-zsh/contributors)
# Copyright (c) 2010-2014 Go authors
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
# * Redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer.
# * Redistributions in binary form must reproduce the above copyright
# notice, this list of conditions and the following disclaimer in the
# documentation and/or other materials provided with the distribution.
# * Neither the name of the zsh-users nor the
# names of its contributors may be used to endorse or promote products
# derived from this software without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL ZSH-USERS BE LIABLE FOR ANY DIRECT,
# INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
# (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
# LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
# ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
# SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
# ------------------------------------------------------------------------------
# Description
# -----------
#
# Completion script for go 1.5 (http://golang.org).
#
# ------------------------------------------------------------------------------
# Authors
# -------
#
# * Mikkel Oscar Lyderik <mikkeloscar@gmail.com>
# * oh-my-zsh authors:
# https://github.com/robbyrussell/oh-my-zsh/blob/master/plugins/golang/golang.plugin.zsh
# * Go authors
#
# ------------------------------------------------------------------------------
typeset -A opt_args
_go() {
local -a commands build_flags
commands=(
'build:compile packages and dependencies'
'clean:remove object files'
'doc:show documentation for package or symbol'
'env:print Go environment information'
'fix:run go tool fix on packages'
'fmt:run gofmt on package sources'
'generate:generate Go files by processing source'
'get:download and install packages and dependencies'
'install:compile and install packages and dependencies'
'list:list packages'
'run:compile and run Go program'
'test:test packages'
'tool:run specified go tool'
'version:print Go version'
'vet:run go tool vet on packages'
'help:get more information about a command'
)
_arguments \
"1: :{_describe 'command' commands}" \
'*:: :->args'
case $state in
args)
build_flags=(
'-a[force rebuilding of packages that are already up-to-date]'
'-n[print the commands but do not run them]'
'-p[number of builds that can be run in parallel]:number'
'-race[enable data race detection]'
'-v[print the names of packages as they are compiled]'
'-work[print temporary work directory and keep it]'
'-x[print the commands]'
'-asmflags[arguments for each go tool asm invocation]:flags'
'-buildmode[build mode to use]:mode'
'-compiler[name of compiler to use]:name'
'-gccgoflags[arguments for gccgo]:args'
'-gcflags[arguments for each go tool compile invocation]:args'
'-installsuffix[suffix to add to package directory]:suffix'
'-ldflags[arguments to pass on each go tool link invocation.]:flags'
'-linkshared[link against shared libraries]'
'-pkgdir[install and load all packages from dir]:dir'
'-tags[list of build tags to consider satisfied]:tags'
'-toolexec[program to use to invoke toolchain programs]:args'
)
__go_packages() {
local gopaths
declare -a gopaths
gopaths=("${(s/:/)$(go env GOPATH)}")
gopaths+=("$(go env GOROOT)")
for p in $gopaths; do
_path_files -W "$p/src" -/
done
}
case $words[1] in
build)
_arguments \
'-o[force build to write to named output file]:file:_files' \
'-i[installs the packages that are dependencies of the target]' \
${build_flags[@]} \
'*:importpaths:__go_packages'
;;
clean)
_arguments \
'-i[remove corresponding installed archive or binary]' \
'-r[apply clean recursively on all dependencies]' \
${build_flags[@]} \
'*:importpaths:__go_packages'
;;
doc)
_arguments \
'-c[respect case when matching symbols]' \
'-cmd[treat a command (package main) like a regular package]' \
'-u[show docs for unexported and exported symbols and methods]'
;;
fix)
_arguments '*:importpaths:__go_packages'
;;
fmt)
_arguments \
'-n[prints commands that would be executed]' \
'-x[prints commands as they are executed]' \
'*:importpaths:__go_packages'
;;
generate)
_arguments \
'-run=[specifies a regular expression to select directives]:regex' \
'-x[print the commands]' \
'-n[print the commands but do not run them]' \
'-v[print the names of packages as they are compiled]' \
"*:args:{ _alternative ':importpaths:__go_packages' _files }"
;;
get)
_arguments \
'-d[instructs get to stop after downloading the packages]' \
'-f[force get -u not to verify that each package has been checked from vcs]' \
'-fix[run the fix tool on the downloaded packages]' \
'-insecure[permit fetching/resolving custom domains]' \
'-t[also download the packages required to build tests]' \
'-u[use the network to update the named packages]' \
${build_flags[@]} \
'*:importpaths:__go_packages'
;;
install)
_arguments ${build_flags[@]} \
'*:importpaths:__go_packages'
;;
list)
_arguments \
'-e[changes the handling of erroneous packages]' \
'-f[specifies an alternate format for the list]:format' \
'-json[causes package data to be printed in JSON format]' \
${build_flags[@]} \
'*:importpaths:__go_packages'
;;
run)
_arguments \
${build_flags[@]} \
'-exec[invoke the binary using xprog]:xprog' \
'*:file:_files -g "*.go(-.)"'
;;
test)
_arguments \
"-c[compile but don't run test]" \
'-i[install dependencies of the test]' \
'-bench[run benchmarks matching the regular expression]:regexp' \
'-benchmem[print memory allocation statistics for benchmarks]' \
'-benchtime[run benchmarks for t time]:t' \
'-blockprofile[write a goroutine blocking profile to the specified file]:block' \
'-blockprofilerate[control goroutine blocking profiles]:n' \
'-count[run each test and benchmark n times]:n' \
'-cover[enable coverage analysis]' \
'-covermode[set the mode for coverage analysis]:mode:(set count atomic)' \
'-coverpkg[apply coverage analysis in each test of listed packages]:list' \
'-coverprofile[write a coverage profile to file]:cover' \
'-cpu[specify a list of GOMAXPROCS values]:cpus' \
'-cpuprofile[write a CPU profile to the specified file]:profile' \
'-memprofile[write a memory profile to file]:mem' \
'-memprofilerate[enable more precise memory profiles]:n' \
'-outputdir[place output files from profiling in output dir]:dir' \
'-parallel[allow parallel execution of test functions]:n' \
'-run[run tests and examples matching the regular expression]:regexp' \
'-short[tell long-running tests to shorten their run time]' \
'-timeout[timeout long running tests]:t' \
'-trace[write an execution trace to the specified file]:trace' \
'-v[verbose output]' \
${build_flags[@]} \
'-exec[run test binary using xprog]:xprog' \
'-o[compile test binary to named file]:file:_files' \
'*:importpaths:__go_packages'
;;
tool)
local -a tools
tools=($(go tool))
_arguments \
'-n[print command that would be executed]' \
"1: :{_describe 'tool' tools}" \
'*:: :->args'
case $state in
args)
case $words[1] in
addr2line)
_files
;;
asm)
_arguments \
'-D[predefined symbol with optional simple value]:value' \
'-I[include directory]:value' \
'-S[print assembly and machine code]' \
'-debug[dump instructions as they are parsed]' \
'-dynlink[support references to Go symbols]' \
'-o[output file]:string' \
'-shared[generate code that can be linked into a shared lib]' \
'-trimpath[remove prefix from recorded source file paths]:string'
;;
callgraph)
local -a algos graphs
algos=(
'static:static calls only'
'cha:Class Hierarchy Analysis'
'rta:Rapid Type Analysis'
'pta:inclusion-based Points-To Analysis'
)
graphs=(
'digraph:output in digraph format'
'graphviz:output in AT&T GraphViz (.dot) format'
)
_arguments \
'-algo=[call-graph construction algorithm]:algos:{ _describe "algos" algos }' \
"-test[include the package's tests in the analysis]" \
'-format=[format in which each call graph edge is displayed]:graphs:{ _describe "graphs" graphs }'
;;
cgo)
_arguments \
'-debug-define[print relevant #defines]' \
'-debug-gcc[print gcc invocations]' \
'-dynimport[if non-empty, print dynamic import data]:string' \
'-dynlinker[record dynamic linker information]' \
'-dynout[write -dynimport output to file]:file' \
'-dynpackage[set Go package for -dynimport output]:string' \
'-exportheader[where to write export header]:string' \
'-gccgo[generate files for use with gccgo]' \
'-gccgopkgpath[-fgo-pkgpath option used with gccgo]:string' \
'-gccgoprefix[-fgo-prefix option used with gccgo]:string' \
'-godefs[write Go definitions for C file to stdout]' \
'-import_runtime_cgo[import runtime/cgo in generated code]' \
'-import_syscall[import syscall in generated code]' \
'-importpath[import path of package being built]:path' \
'-objdir[object directory]:dir'
;;
compile)
_arguments \
'-%[debug non-static initializers]' \
'-+[compiling runtime]' \
"-A[for bootstrapping, allow 'any' type]" \
'-B[disable bounds checking]' \
'-D[set relative path for local imports]:path' \
'-E[debug symbol export]' \
'-I[add directory to import search path]:directory' \
'-K[debug missing line numbers]' \
'-L[use full (long) path in error messages]' \
'-M[debug move generation]' \
'-N[disable optimizations]' \
'-P[debug peephole optimizer]' \
'-R[debug register optimizer]' \
'-S[print assembly listing]' \
'-V[print compiler version]' \
'-W[debug parse tree after type checking]' \
'-asmhdr[write assembly header to file]:file' \
'-buildid[record id as the build id in the export metadata]:id' \
'-complete[compiling complete package (no C or assembly)]' \
'-cpuprofile[write cpu profile to file]:file' \
'-d[print debug information about items in list]:list' \
'-dynlink[support references to Go symbols]' \
'-e[no limit on number of errors reported]' \
'-f[debug stack frames]' \
'-g[debug code generation]' \
'-h[halt on error]' \
'-i[debug line number stack]' \
'-importmap[add definition of the form source=actual to import map]:definition' \
'-installsuffix[set pkg directory suffix]:suffix' \
'-j[debug runtime-initialized variables]' \
'-l[disable inlining]' \
'-largemodel[generate code that assumes a large memory model]' \
'-live[debug liveness analysis]' \
'-m[print optimization decisions]' \
'-memprofile[write memory profile to file]:file' \
'-memprofilerate[set runtime.MemProfileRate to rate]:rate' \
'-nolocalimports[reject local (relative) imports]' \
'-o[write output to file]:file' \
'-p[set expected package import path]:path' \
'-pack[write package file instead of object file]' \
'-r[debug generated wrappers]' \
'-race[enable race detector]' \
'-s[warn about composite literals that can be simplified]' \
'-shared[generate code that can be linked into a shared library]' \
'-trimpath[remove prefix from recorded source file paths]:prefix' \
'-u[reject unsafe code]' \
'-v[increase debug verbosity]' \
'-w[debug type checking]' \
'-wb[enable write barrier (default 1)]' \
'-x[debug lexer]' \
'-y[debug declarations in canned imports (with -d)]' \
'*:file:_files -g "*.go(-.)"'
;;
cover)
if (( CURRENT == 2 )); then
_arguments \
'-func=[output coverage profile information for each function]:string' \
'-html=[generate HTML representation of coverage profile]:file:_files' \
'-mode=[coverage mode]:mode:(set count atomic)'
return
fi
_arguments \
'-o[file for output]:file' \
'-var=[name of coverage variable to generate]:var' \
'*:file:_files -g "*.go(-.)"'
;;
doc)
_arguments \
'-c[respect case when matching symbols]' \
'-cmd[treat a command (package main) like a regular package]' \
'-u[show docs for unexported and exported symbols and methods]'
;;
fix)
_arguments \
'-diff[display diffs instead of rewriting files]' \
'-force[force fixes to run even if the code looks updated]:string' \
'-r[restrict the rewrites]:string' \
'*:files:_files'
;;
link)
_arguments \
'-B[add an ELF NT_GNU_BUILD_ID note when using ELF]:note' \
'-C[check Go calls to C code]' \
'-D[set data segment address (default -1)]:address' \
'-E[set entry symbol name]:entry' \
'-H[set header type]:type' \
'-I[use linker as ELF dynamic linker]:linker' \
'-L[add specified directory to library path]:directory' \
'-R[set address rounding quantum (default -1)]:quantum' \
'-T[set text segment address (default -1)]:address' \
'-V[print version and exit]' \
'-W[disassemble input]' \
'-X[add string value definition]:definition' \
'-a[disassemble output]' \
'-buildid[record id as Go toolchain build id]:id' \
'-buildmode[set build mode]:mode' \
'-c[dump call graph]' \
'-cpuprofile[write cpu profile to file]:file' \
'-d[disable dynamic executable]' \
'-extld[use linker when linking in external mode]:linker' \
'-extldflags[pass flags to external linker]:flags' \
'-f[ignore version mismatch]' \
'-g[disable go package data checks]' \
'-h[halt on error]' \
'-installsuffix[set package directory suffix]:suffix' \
'-k[set field tracking symbol]:symbol' \
'-linkmode[set link mode]:mode:(internal external auto)' \
'-linkshared[link against installed Go shared libraries]' \
'-memprofile[write memory profile to file]:file' \
'-memprofilerate[set runtime.MemProfileRate to rate]:rate' \
'-n[dump symbol table]' \
'-o[write output to file]:file' \
'-r[set the ELF dynamic linker search path to dir1:dir2:...]:path' \
'-race[enable race detector]' \
'-s[disable symbol table]' \
'-shared[generate shared object (implies -linkmode external)]' \
'-tmpdir[use directory for temporary files]:directory' \
'-u[reject unsafe packages]' \
'-v[print link trace]' \
'-w[disable DWARF generation]' \
'*:files:_files'
;;
objdump)
_arguments \
'-s[only dump symbols matching this regexp]:regexp' \
'*:files:_files'
;;
pack)
_arguments '1:ops:(c p r t x)' '::verbose:(v)' ':files:_files'
;;
pprof)
_arguments \
'-callgrind[outputs a graph in callgrind format]' \
'-disasm=[output annotated assembly]:p' \
'-dot[outputs a graph in DOT format]' \
'-eog[visualize graph through eog]' \
'-evince[visualize graph through evince]' \
'-gif[outputs a graph image in GIF format]' \
'-gv[visualize graph through gv]' \
'-list=[output annotated source for functions matching regexp]:p' \
'-pdf[outputs a graph in PDF format]' \
'-peek=[output callers/callees of functions matching regexp]:p' \
'-png[outputs a graph image in PNG format]' \
'-proto[outputs the profile in compressed protobuf format]' \
'-ps[outputs a graph in PS format]' \
'-raw[outputs a text representation of the raw profile]' \
'-svg[outputs a graph in SVG format]' \
'-tags[outputs all tags in the profile]' \
'-text[outputs top entries in text form]' \
'-top[outputs top entries in text form]' \
'-tree[outputs a text rendering of call graph]' \
'-web[visualize graph through web browser]' \
'-weblist=[output annotated source in HTML]:p' \
'-output=[generate output on file f (stdout by default)]:f' \
'-functions[report at function level (default)]' \
'-files[report at source file level]' \
'-lines[report at source line level]' \
'-addresses[report at address level]' \
'-base[show delta from this profile]:profile' \
'-drop_negative[ignore negative differences]' \
'-cum[sort by cumulative data]' \
'-seconds=[length of time for dynamic profiles]:n' \
'-nodecount=[max number of nodes to show]:n' \
'-nodefraction=[hide nodes below <f>*total]:f' \
'-edgefraction=[hide edges below <f>*total]:f' \
'-sample_index[index of sample value to display]' \
'-mean[average sample value over first value]' \
'-inuse_space[display in-use memory size]' \
'-inuse_objects[display in-use object counts]' \
'-alloc_space[display allocated memory size]' \
'-alloc_objects[display allocated object counts]' \
'-total_delay[display total delay at each region]' \
'-contentions[display number of delays at each region]' \
'-mean_delay[display mean delay at each region]' \
'-runtime[show runtime call frames in memory profiles]' \
'-focus=[restricts to paths going through a node matching regexp]:r' \
'-ignore=[skips paths going through any nodes matching regexp]:r' \
'-tagfocus=[restrict to samples tagged with key:value matching regexp]:r' \
'-tagignore=[discard samples tagged with key:value matching regexp]' \
'-call_tree[generate a context-sensitive call tree]' \
'-unit=[convert all samples to unit u for display]:u' \
'-divide_by=[scale all samples by dividing them by f]:f' \
'-buildid=[override build id for main binary in profile]:id' \
'-tools=[search path for object-level tools]:path' \
'-help[help message]' \
'*:files:_files'
;;
trace)
_arguments \
'-http=[HTTP service address]:addr' \
'*:files:_files'
;;
vet)
_arguments \
'-all[check everything]' \
'-asmdecl[check assembly against Go declarations]' \
'-assign[check for useless assignments]' \
'-atomic[check for common mistaken usages of the sync/atomic package]' \
'-bool[check for mistakes involving boolean operators]' \
'-buildtags[check that +build tags are valid]' \
'-composites[check that composite literals use field-keyed elements]' \
'-compositewhitelist[use composite white list]' \
'-copylocks[check that locks are not passed by value]' \
'-methods[check that canonically named methods are canonically defined]' \
'-nilfunc[check for comparisons between functions and nil]' \
'-printf[check printf-like invocations]' \
'-printfuncs[print function names to check]:string' \
'-rangeloops[check that range loop variables are used correctly]' \
'-shadow[check for shadowed variables]' \
'-shadowstrict[whether to be strict about shadowing]' \
'-shift[check for useless shifts]' \
'-structtags[check that struct field tags have canonical format]' \
'-tags[list of build tags to apply when parsing]:list' \
'-test[for testing only: sets -all and -shadow]' \
'-unreachable[check for unreachable code]' \
'-unsafeptr[check for misuse of unsafe.Pointer]' \
'-unusedfuncs[list of functions whose results must be used]:string' \
'-unusedresult[check for unused result of calls to functions in -unusedfuncs]' \
'-unusedstringmethods[list of methods whose results must be used]:string' \
'-v[verbose]' \
'*:files:_files'
;;
yacc)
_arguments \
'-o[write output to this file]:output' \
'-v[write parse table to this file]:parsetable' \
'*:files:_files'
;;
esac
;;
esac
;;
vet)
_arguments \
'-n[print commands that would be executed]' \
'-x[prints commands as they are executed]' \
${build_flags[@]} \
'*:importpaths:__go_packages'
;;
help)
local -a topics
topics=(
'c:calling between Go and C'
'buildmode:description of build modes'
'filetype:file types'
'gopath:GOPATH environment variable'
'environment:environment variables'
'importpath:import path syntax'
'packages:description of package lists'
'testflag:description of testing flags'
'testfunc:description of testing functions'
)
_arguments "1: :{_describe 'command' commands -- topics}"
;;
esac
;;
esac
}
_go


@@ -0,0 +1,646 @@
#compdef go
# ------------------------------------------------------------------------------
# Copyright (c) 2016 Github zsh-users - http://github.com/zsh-users
# Copyright (c) 2013-2015 Robby Russell and contributors (see
# https://github.com/robbyrussell/oh-my-zsh/contributors)
# Copyright (c) 2010-2014 Go authors
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
# * Redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer.
# * Redistributions in binary form must reproduce the above copyright
# notice, this list of conditions and the following disclaimer in the
# documentation and/or other materials provided with the distribution.
# * Neither the name of the zsh-users nor the
# names of its contributors may be used to endorse or promote products
# derived from this software without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL ZSH-USERS BE LIABLE FOR ANY DIRECT,
# INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
# (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
# LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
# ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
# SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
# ------------------------------------------------------------------------------
# Description
# -----------
#
# Completion script for go 1.11 (http://golang.org).
#
# ------------------------------------------------------------------------------
# Authors
# -------
#
# * Mikkel Oscar Lyderik Larsen <mikkeloscar@gmail.com>
# * oh-my-zsh authors:
# https://github.com/robbyrussell/oh-my-zsh/blob/master/plugins/golang/golang.plugin.zsh
# * Go authors
#
# ------------------------------------------------------------------------------
typeset -A opt_args
__go_buildmodes() {
local -a buildmodes
buildmodes=(
'archive[non-main packages into .a files]'
'c-archive[main package, plus all packages it imports, into a C archive file]'
'c-shared[main package, plus all packages it imports, into a C shared library]'
'default[main packages are built into executables and listed non-main packages are built into .a files]'
'shared[non-main packages into a single shared library that will be used when building with the -linkshared option]'
'exe[main packages into executables]'
'pie[main packages and everything they import into position independent executables (PIE)]'
'plugin[main packages, plus all packages that they import, into a Go plugin]'
)
_values 'mode' $buildmodes
}
local -a commands build_flags
commands=(
'bug:start a bug report'
'build:compile packages and dependencies'
'clean:remove object files and cached files'
'doc:show documentation for package or symbol'
'env:print Go environment information'
'fix:update packages to use new APIs'
'fmt:gofmt (reformat) package sources'
'generate:generate Go files by processing source'
'get:download and install packages and dependencies'
'install:compile and install packages and dependencies'
'list:list packages or modules'
'mod:module maintenance'
'run:compile and run Go program'
'test:test packages'
'tool:run specified go tool'
'version:print Go version'
'vet:report likely mistakes in packages'
'help:get more information about a command'
)
_arguments \
"1: :{_describe 'command' commands}" \
'*:: :->args'
case $state in
args)
build_flags=(
'-a[force rebuilding of packages that are already up-to-date]'
'-n[print the commands but do not run them]'
'-p[number of builds that can be run in parallel]:number'
'-race[enable data race detection]'
'-v[print the names of packages as they are compiled]'
'-work[print temporary work directory and keep it]'
'-x[print the commands]'
'-asmflags[arguments for each go tool asm invocation]:flags'
'-buildmode[build mode to use]:mode:__go_buildmodes'
'-compiler[name of compiler to use]:name'
'-gccgoflags[arguments for gccgo]:args'
'-gcflags[arguments for each go tool compile invocation]:args'
'-installsuffix[suffix to add to package directory]:suffix'
'-ldflags[arguments to pass on each go tool link invocation.]:flags'
'-linkshared[link against shared libraries]'
'-pkgdir[install and load all packages from dir]:dir'
'-tags[list of build tags to consider satisfied]:tags'
'-toolexec[program to use to invoke toolchain programs]:args'
)
__go_packages() {
local gopaths
declare -a gopaths
gopaths=("${(s/:/)$(go env GOPATH)}")
gopaths+=("$(go env GOROOT)")
for p in $gopaths; do
_path_files -W "$p/src" -/
done
}
case $words[1] in
build)
_arguments \
'-o[force build to write to named output file]:file:_files' \
'-i[installs the packages that are dependencies of the target]' \
${build_flags[@]} \
'*:importpaths:__go_packages'
;;
clean)
_arguments \
'-i[remove corresponding installed archive or binary]' \
'-r[apply clean recursively on all dependencies]' \
${build_flags[@]} \
'*:importpaths:__go_packages'
;;
doc)
_arguments \
'-c[respect case when matching symbols]' \
'-cmd[treat a command (package main) like a regular package]' \
'-u[show docs for unexported and exported symbols and methods]'
;;
fix)
_arguments '*:importpaths:__go_packages'
;;
fmt)
_arguments \
'-n[prints commands that would be executed]' \
'-x[prints commands as they are executed]' \
'*:importpaths:__go_packages'
;;
generate)
_arguments \
'-run=[specifies a regular expression to select directives]:regex' \
'-x[print the commands]' \
'-n[print the commands but do not run them]' \
'-v[print the names of packages as they are compiled]' \
"*:args:{ _alternative ':importpaths:__go_packages' _files }"
;;
get)
_arguments \
'-d[instructs get to stop after downloading the packages]' \
'-f[force get -u not to verify that each package has been checked out from a VCS]' \
'-fix[run the fix tool on the downloaded packages]' \
'-insecure[permit fetching/resolving custom domains]' \
'-t[also download the packages required to build tests]' \
'-u[use the network to update the named packages]' \
${build_flags[@]} \
'*:importpaths:__go_packages'
;;
install)
_arguments ${build_flags[@]} \
'*:importpaths:__go_packages'
;;
list)
_arguments \
'-e[changes the handling of erroneous packages]' \
'-f[specifies an alternate format for the list]:format' \
'-json[causes package data to be printed in JSON format]' \
'-compiled[set CompiledGoFiles to the Go source files presented to the compiler]' \
'-deps[iterate over named packages and their dependencies]' \
'-m[list modules instead of packages]' \
${build_flags[@]} \
'*:importpaths:__go_packages'
;;
mod)
local -a mod_commands
mod_commands=(
'download:download modules to local cache'
'edit:edit go.mod from tools or scripts'
'graph:print module requirement graph'
'init:initialize new module in current directory'
'tidy:add missing and remove unused modules'
'vendor:make vendored copy of dependencies'
'verify:verify dependencies have expected content'
'why:explain why packages or modules are needed'
'help:get more information about a command'
)
_arguments \
"1: :{_describe 'command' mod_commands}" \
'*:: :->args'
case $state in
args)
case $words[1] in
download)
_arguments \
'-json[print a sequence of JSON objects to standard output]'
;;
edit)
_arguments \
'-fmt[reformats the go.mod file without making other changes]' \
"-module[change the module's path]" \
'*-require=[add a requirement on the given module path and version]:require' \
'*-droprequire=[drop a requirement on the given module path and version]:droprequire' \
'*-exclude=[add an exclusion for the given module path and version]:exclude' \
'*-dropexclude=[drop an exclusion for the given module path and version]:dropexclude' \
'*-replace=[add a replacement of the given module path and version]:replace' \
'*-dropreplace=[drop a replacement of the given module path and version]:dropreplace' \
'-json[prints the final go.mod file in JSON format]' \
'-print[prints the final go.mod in its text format]' \
':go.mod:_path_files -g "go.mod"'
;;
graph)
;;
init)
# Use go packages as module name suggestion
_arguments \
'*:module:__go_packages'
;;
tidy)
_arguments \
'-v[print information about removed modules to standard error]'
;;
vendor)
_arguments \
'-v[print the names of vendored modules and packages to standard error]'
;;
verify)
;;
why)
_arguments \
'-m[treats the arguments as a list of modules]' \
'-vendor[exclude tests of dependencies]' \
'*:module:__go_packages'
;;
esac
;;
esac
;;
run)
_arguments \
${build_flags[@]} \
'-exec[invoke the binary using xprog]:xprog' \
'*:file:_files -g "*.go(-.)"'
;;
test)
_arguments \
"-c[compile but don't run test]" \
'-i[install dependencies of the test]' \
'-bench[run benchmarks matching the regular expression]:regexp' \
'-benchmem[print memory allocation statistics for benchmarks]' \
'-benchtime[run benchmarks for t time]:t' \
'-blockprofile[write a goroutine blocking profile to the specified file]:block' \
'-blockprofilerate[control goroutine blocking profiles]:n' \
'-count[run each test and benchmark n times]:n' \
'-cover[enable coverage analysis]' \
'-covermode[set the mode for coverage analysis]:mode:(set count atomic)' \
'-coverpkg[apply coverage analysis in each test of listed packages]:list' \
'-coverprofile[write a coverage profile to file]:cover' \
'-cpu[specify a list of GOMAXPROCS values]:cpus' \
'-cpuprofile[write a CPU profile to the specified file]:profile' \
'-memprofile[write a memory profile to file]:mem' \
'-memprofilerate[enable more precise memory profiles]:n' \
'-outputdir[place output files from profiling in output dir]:dir' \
'-parallel[allow parallel execution of test functions]:n' \
'-run[run tests and examples matching the regular expression]:regexp' \
'-short[tell long-running tests to shorten their run time]' \
'-timeout[timeout long running tests]:t' \
'-trace[write an execution trace to the specified file]:trace' \
'-v[verbose output]' \
${build_flags[@]} \
'-exec[run test binary using xprog]:xprog' \
'-o[compile test binary to named file]:file:_files' \
'*:importpaths:__go_packages'
;;
tool)
local -a tools
tools=($(go tool))
_arguments \
'-n[print command that would be executed]' \
"1: :{_describe 'tool' tools}" \
'*:: :->args'
case $state in
args)
case $words[1] in
addr2line)
_files
;;
asm)
_arguments \
'-D[predefined symbol with optional simple value]:value' \
'-I[include directory]:value' \
'-S[print assembly and machine code]' \
'-debug[dump instructions as they are parsed]' \
'-dynlink[support references to Go symbols]' \
'-o[output file]:string' \
'-shared[generate code that can be linked into a shared lib]' \
'-trimpath[remove prefix from recorded source file paths]:string'
;;
callgraph)
local -a algos graphs
algos=(
'static:static calls only'
'cha:Class Hierarchy Analysis'
'rta:Rapid Type Analysis'
'pta:inclusion-based Points-To Analysis'
)
graphs=(
'digraph:output in digraph format'
'graphviz:output in AT&T GraphViz (.dot) format'
)
_arguments \
'-algo=[call-graph construction algorithm]:algos:{ _describe "algos" algos }' \
"-test[include the package's tests in the analysis]" \
'-format=[format in which each call graph edge is displayed]:graphs:{ _describe "graphs" graphs }'
;;
cgo)
_arguments \
'-debug-define[print relevant #defines]' \
'-debug-gcc[print gcc invocations]' \
'-dynimport[if non-empty, print dynamic import data]:string' \
'-dynlinker[record dynamic linker information]' \
'-dynout[write -dynimport output to file]:file' \
'-dynpackage[set Go package for -dynimport output]:string' \
'-exportheader[where to write export header]:string' \
'-gccgo[generate files for use with gccgo]' \
'-gccgopkgpath[-fgo-pkgpath option used with gccgo]:string' \
'-gccgoprefix[-fgo-prefix option used with gccgo]:string' \
'-godefs[write Go definitions for C file to stdout]' \
'-import_runtime_cgo[import runtime/cgo in generated code]' \
'-import_syscall[import syscall in generated code]' \
'-importpath[import path of package being built]:path' \
'-objdir[object directory]:dir'
;;
compile)
_arguments \
'-%[debug non-static initializers]' \
'-+[compiling runtime]' \
"-A[for bootstrapping, allow 'any' type]" \
'-B[disable bounds checking]' \
'-D[set relative path for local imports]:path' \
'-E[debug symbol export]' \
'-I[add directory to import search path]:directory' \
'-K[debug missing line numbers]' \
'-L[use full (long) path in error messages]' \
'-M[debug move generation]' \
'-N[disable optimizations]' \
'-P[debug peephole optimizer]' \
'-R[debug register optimizer]' \
'-S[print assembly listing]' \
'-V[print compiler version]' \
'-W[debug parse tree after type checking]' \
'-asmhdr[write assembly header to file]:file' \
'-buildid[record id as the build id in the export metadata]:id' \
'-complete[compiling complete package (no C or assembly)]' \
'-cpuprofile[write cpu profile to file]:file' \
'-d[print debug information about items in list]:list' \
'-dynlink[support references to Go symbols]' \
'-e[no limit on number of errors reported]' \
'-f[debug stack frames]' \
'-g[debug code generation]' \
'-h[halt on error]' \
'-i[debug line number stack]' \
'-importmap[add definition of the form source=actual to import map]:definition' \
'-installsuffix[set pkg directory suffix]:suffix' \
'-j[debug runtime-initialized variables]' \
'-l[disable inlining]' \
'-largemodel[generate code that assumes a large memory model]' \
'-live[debug liveness analysis]' \
'-m[print optimization decisions]' \
'-memprofile[write memory profile to file]:file' \
'-memprofilerate[set runtime.MemProfileRate to rate]:rate' \
'-nolocalimports[reject local (relative) imports]' \
'-o[write output to file]:file' \
'-p[set expected package import path]:path' \
'-pack[write package file instead of object file]' \
'-r[debug generated wrappers]' \
'-race[enable race detector]' \
'-s[warn about composite literals that can be simplified]' \
'-shared[generate code that can be linked into a shared library]' \
'-trimpath[remove prefix from recorded source file paths]:prefix' \
'-u[reject unsafe code]' \
'-v[increase debug verbosity]' \
'-w[debug type checking]' \
'-wb[enable write barrier (default 1)]' \
'-x[debug lexer]' \
'-y[debug declarations in canned imports (with -d)]' \
'*:file:_files -g "*.go(-.)"'
;;
cover)
if (( CURRENT == 2 )); then
_arguments \
'-func=[output coverage profile information for each function]:string' \
'-html=[generate HTML representation of coverage profile]:file:_files' \
'-mode=[coverage mode]:mode:(set count atomic)'
return
fi
_arguments \
'-o[file for output]:file' \
'-var=[name of coverage variable to generate]:var' \
'*:file:_files -g "*.go(-.)"'
;;
doc)
_arguments \
'-c[respect case when matching symbols]' \
'-cmd[treat a command (package main) like a regular package]' \
'-u[show docs for unexported and exported symbols and methods]'
;;
fix)
_arguments \
'-diff[display diffs instead of rewriting files]' \
'-force[force fixes to run even if the code looks updated]:string' \
'-r[restrict the rewrites]:string' \
'*:files:_files'
;;
link)
_arguments \
'-B[add an ELF NT_GNU_BUILD_ID note when using ELF]:note' \
'-C[check Go calls to C code]' \
'-D[set data segment address (default -1)]:address' \
'-E[set entry symbol name]:entry' \
'-H[set header type]:type' \
'-I[use linker as ELF dynamic linker]:linker' \
'-L[add specified directory to library path]:directory' \
'-R[set address rounding quantum (default -1)]:quantum' \
'-T[set text segment address (default -1)]:address' \
'-V[print version and exit]' \
'-W[disassemble input]' \
'-X[add string value definition]:definition' \
'-a[disassemble output]' \
'-buildid[record id as Go toolchain build id]:id' \
'-buildmode[set build mode]:mode' \
'-c[dump call graph]' \
'-cpuprofile[write cpu profile to file]:file' \
'-d[disable dynamic executable]' \
'-extld[use linker when linking in external mode]:linker' \
'-extldflags[pass flags to external linker]:flags' \
'-f[ignore version mismatch]' \
'-g[disable go package data checks]' \
'-h[halt on error]' \
'-installsuffix[set package directory suffix]:suffix' \
'-k[set field tracking symbol]:symbol' \
'-linkmode[set link mode]:mode:(internal external auto)' \
'-linkshared[link against installed Go shared libraries]' \
'-memprofile[write memory profile to file]:file' \
'-memprofilerate[set runtime.MemProfileRate to rate]:rate' \
'-n[dump symbol table]' \
'-o[write output to file]:file' \
'-r[set the ELF dynamic linker search path to dir1:dir2:...]:path' \
'-race[enable race detector]' \
'-s[disable symbol table]' \
'-shared[generate shared object (implies -linkmode external)]' \
'-tmpdir[use directory for temporary files]:directory' \
'-u[reject unsafe packages]' \
'-v[print link trace]' \
'-w[disable DWARF generation]' \
'*:files:_files'
;;
objdump)
_arguments \
'-s[only dump symbols matching this regexp]:regexp' \
'*:files:_files'
;;
pack)
_arguments '1:ops:(c p r t x)' '::verbose:(v)' ':files:_files'
;;
pprof)
_arguments \
'-callgrind[outputs a graph in callgrind format]' \
'-disasm=[output annotated assembly]:p' \
'-dot[outputs a graph in DOT format]' \
'-eog[visualize graph through eog]' \
'-evince[visualize graph through evince]' \
'-gif[outputs a graph image in GIF format]' \
'-gv[visualize graph through gv]' \
'-list=[output annotated source for functions matching regexp]:p' \
'-pdf[outputs a graph in PDF format]' \
'-peek=[output callers/callees of functions matching regexp]:p' \
'-png[outputs a graph image in PNG format]' \
'-proto[outputs the profile in compressed protobuf format]' \
'-ps[outputs a graph in PS format]' \
'-raw[outputs a text representation of the raw profile]' \
'-svg[outputs a graph in SVG format]' \
'-tags[outputs all tags in the profile]' \
'-text[outputs top entries in text form]' \
'-top[outputs top entries in text form]' \
'-tree[outputs a text rendering of call graph]' \
'-web[visualize graph through web browser]' \
'-weblist=[output annotated source in HTML]:p' \
'-output=[generate output on file f (stdout by default)]:f' \
'-functions[report at function level (default)]' \
'-files[report at source file level]' \
'-lines[report at source line level]' \
'-addresses[report at address level]' \
'-base[show delta from this profile]:profile' \
'-drop_negative[ignore negative differences]' \
'-cum[sort by cumulative data]' \
'-seconds=[length of time for dynamic profiles]:n' \
'-nodecount=[max number of nodes to show]:n' \
'-nodefraction=[hide nodes below <f>*total]:f' \
'-edgefraction=[hide edges below <f>*total]:f' \
'-sample_index[index of sample value to display]' \
'-mean[average sample value over first value]' \
'-inuse_space[display in-use memory size]' \
'-inuse_objects[display in-use object counts]' \
'-alloc_space[display allocated memory size]' \
'-alloc_objects[display allocated object counts]' \
'-total_delay[display total delay at each region]' \
'-contentions[display number of delays at each region]' \
'-mean_delay[display mean delay at each region]' \
'-runtime[show runtime call frames in memory profiles]' \
'-focus=[restricts to paths going through a node matching regexp]:r' \
'-ignore=[skips paths going through any nodes matching regexp]:r' \
'-tagfocus=[restrict to samples tagged with key:value matching regexp]:r' \
'-tagignore=[discard samples tagged with key:value matching regexp]' \
'-call_tree[generate a context-sensitive call tree]' \
'-unit=[convert all samples to unit u for display]:u' \
'-divide_by=[scale all samples by dividing them by f]:f' \
'-buildid=[override build id for main binary in profile]:id' \
'-tools=[search path for object-level tools]:path' \
'-help[help message]' \
'*:files:_files'
;;
trace)
_arguments \
'-http=[HTTP service address]:addr' \
'*:files:_files'
;;
vet)
_arguments \
'-all[check everything]' \
'-asmdecl[check assembly against Go declarations]' \
'-assign[check for useless assignments]' \
'-atomic[check for common mistaken usages of the sync/atomic package]' \
'-bool[check for mistakes involving boolean operators]' \
'-buildtags[check that +build tags are valid]' \
'-composites[check that composite literals use field-keyed elements]' \
'-compositewhitelist[use composite white list]' \
'-copylocks[check that locks are not passed by value]' \
'-methods[check that canonically named methods are canonically defined]' \
'-nilfunc[check for comparisons between functions and nil]' \
'-printf[check printf-like invocations]' \
'-printfuncs[print function names to check]:string' \
'-rangeloops[check that range loop variables are used correctly]' \
'-shadow[check for shadowed variables]' \
'-shadowstrict[whether to be strict about shadowing]' \
'-shift[check for useless shifts]' \
'-structtags[check that struct field tags have canonical format]' \
'-tags[list of build tags to apply when parsing]:list' \
'-test[for testing only: sets -all and -shadow]' \
'-unreachable[check for unreachable code]' \
'-unsafeptr[check for misuse of unsafe.Pointer]' \
'-unusedfuncs[list of functions whose results must be used]:string' \
'-unusedresult[check for unused result of calls to functions in -unusedfuncs]' \
'-unusedstringmethods[list of methods whose results must be used]:string' \
'-v[verbose]' \
'*:files:_files'
;;
yacc)
_arguments \
'-o[write output to this file]:output' \
'-v[write parse table to this file]:parsetable' \
'*:files:_files'
;;
esac
;;
esac
;;
vet)
_arguments \
'-n[print commands that would be executed]' \
'-x[prints commands as they are executed]' \
${build_flags[@]} \
'*:importpaths:__go_packages'
;;
help)
local -a topics
topics=(
'buildmode:build modes'
'c:calling between Go and C'
'cache:build and test caching'
'environment:environment variables'
'filetype:file types'
'go.mod:the go.mod file'
'gopath:GOPATH environment variable'
'gopath-get:legacy GOPATH go get'
'goproxy:module proxy protocol'
'importpath:import path syntax'
'modules:modules, module versions, and more'
'module-get:module-aware go get'
'packages:package lists and patterns'
'testflag:testing flags'
'testfunc:testing functions'
)
_arguments "1: :{_describe 'command' commands -- topics}"
;;
esac
;;
esac


@@ -0,0 +1,286 @@
#compdef hledger
# ------------------------------------------------------------------------------
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ------------------------------------------------------------------------------
# Description
# -----------
#
# Completion script for hledger 1.10 ( http://hledger.org/ )
# Last updated: 07.08.2018
#
# ------------------------------------------------------------------------------
# Authors
# -------
#
# * Valodim ( https://github.com/Valodim )
# * fdw ( https://github.com/fdw )
#
# ------------------------------------------------------------------------------
# Notes
# -----
#
# account completion depends on availability of a ~/.hledger.journal file
#
# ------------------------------------------------------------------------------
local curcontext="$curcontext" curstate state line expl grp cmd ret=1
typeset -a args
typeset -A opt_args
args=(
'(- *)'{-h,--help}'[print help information]'
'(-f --file)'{-f,--file}'=[use a different input file]:input file:_files'
'--rules-file=[CSV conversion rules file]:rules file:_files'
'--alias=[display accounts named OLD as NEW]:alias specification'
'--anon[anonymize accounts and payees]'
'(-b --begin)'{-b,--begin}'=[include postings/txns on or after this date]:date'
'(-e --end)'{-e,--end}'=[include postings/txns before this date]:date'
'(-D --daily)'{-D,--daily}'[multiperiod/multicolumn report by day]'
'(-W --weekly)'{-W,--weekly}'[multiperiod/multicolumn report by week]'
'(-M --monthly)'{-M,--monthly}'[multiperiod/multicolumn report by month]'
'(-Q --quarterly)'{-Q,--quarterly}'[multiperiod/multicolumn report by quarter]'
'(-Y --yearly)'{-Y,--yearly}'[multiperiod/multicolumn report by year]'
'(-p --period)'{-p,--period}'=[set start date, end date, and/or reporting interval all at once]'
'(-C --cleared)'{-C,--cleared}'[include only cleared postings/txns]'
'(-U --uncleared)'{-U,--uncleared}'[include only uncleared postings/txns]'
'(-R --real)'{-R,--real}'[include only non-virtual postings]'
'(--depth)--depth=[hide accounts/postings deeper than N]:depth'
'(-E --empty)'{-E,--empty}'[show empty/zero things which are normally omitted]'
'(-B --cost)'{-B,--cost}'[show amounts in their cost price'\''s commodity]'
'(-V --value)'{-V,--value}'[converts reported amounts to the current market value]'
'(-I --ignore-assertions)'{-I,--ignore-assertions}'[ignore any failing balance assertions]'
'--forecast=[apply periodic transaction rules to generate future transactions]'
)
_arguments -C "$args[@]" -A "-*" \
'(- *)--version[print version information]' \
'--debug[show debug output]' \
'1: :->cmds' \
'*:: :->args' && ret=0
while (( $#state )); do
curstate=$state
shift state
case $curstate in
cmds)
typeset -a cmds
cmds=(
'accounts:show account names (a)'
'activity:show an ascii barchart of posting counts per interval'
'add:prompt for transactions and add them to the journal'
'balance:show accounts and balances (b, bal)'
'balancesheet:show a balance sheet (bs)'
'balancesheetequity:like balancesheet, but also reports equity'
'cashflow:show a cashflow statement (cf)'
'check-dates:check that transactions are sorted by increasing date'
'check-dupes:report account names having the same leaf but different prefixes'
'close:print closing/opening transactions that bring some or all account balances to zero and back'
'help:show any of the hledger manuals'
'import:read new transactions added to each file since last run, and add them to the main journal file'
'incomestatement:show an income statement (is)'
'prices:print market price directives from the journal'
'print:show transaction entries (p, txns)'
'print-unique:print transactions which do not reuse an already-seen description'
'register:show postings and running total (r, reg)'
'register-match:print the one posting whose transaction description is closest to the given description'
'rewrite:print all transactions, adding custom postings to the matched ones'
'stats:show some journal statistics'
'tags:list all the tag names used in the journal'
'test:run built-in unit tests'
)
_describe 'subcommands' cmds && ret=0
;;
args)
: $words
local cmd=$words[1]
(( $+cmd )) || return 1
# curcontext="${curcontext%:*:*}:$service-$cmd:"
case $cmd in
accounts)
args=(
'(--declared)--declared[show account names declared with account directives]'
'(--used)--used[show account names posted to by transactions]'
'(--tree)--tree[show accounts as a tree (default in simple reports)]'
'(--flat)--flat[show accounts as a list (default in multicolumn)]'
'(--drop)--drop=[in flat mode, omit N leading account name parts]:drop n'
)
;;
activity)
;;
add)
args=(
'(--no-new-accounts)--no-new-accounts[do not allow creating new accounts]'
)
;;
bal|balance)
args+=(
'(--change)--change[show balance change in each period (default)]'
'(--cumulative)--cumulative[show balance change accumulated across periods]'
'(-H --historical)'{-H,--historical}'[show historical ending balance in each period]'
'(--tree)--tree[show accounts as a tree (default in simple reports)]'
'(--flat)--flat[show accounts as a list (default in multicolumn)]'
'(-A --average)'{-A,--average}'[show a row average column (in multicolumn mode)]'
'(-T --row-total)'{-T,--row-total}'[show a row total column]'
'(-N --no-total)'{-N,--no-total}'[do not show the final total row]'
'(--drop)--drop=[in flat mode, omit N leading account name parts]:drop n'
'(--no-elide)--no-elide[in tree mode, do not squash boring parent accounts]'
'(--format)--format=[in tree mode, use this custom line format]:custom line format'
'(-O --output-format)'{-O,--output-format}='[select the output format from txt, csv, html]:format'
'(-o --output-file)'{-o,--output-file}'=[write output to file]:file'
'(--pretty-tables)--pretty-tables[use unicode to display prettier tables]'
'(--sort-amount)--sort-amount[sort by amount instead of account code/name]'
'(--invert)--invert[display all amounts with reversed sign]'
'(--budget)--budget[show performance compared to budget goals]'
'(--show-unbudgeted)--show-unbudgeted[with --budget, show unbudgeted accounts also]'
)
;;
bl|balancesheet|balancesheetequity)
args+=(
'(--change)--change[show balance change in each period (default)]'
'(--cumulative)--cumulative[show balance change accumulated across periods]'
'(-H --historical)'{-H,--historical}'[show historical ending balance in each period]'
'(--tree)--tree[show accounts as a tree (default in simple reports)]'
'(--flat)--flat[show accounts as a list (default in multicolumn)]'
'(-A --average)'{-A,--average}'[show a row average column (in multicolumn mode)]'
'(-T --row-total)'{-T,--row-total}'[show a row total column]'
'(-N --no-total)'{-N,--no-total}'[do not show the final total row]'
'(--drop)--drop=[in flat mode, omit N leading account name parts]:drop n'
'(--no-elide)--no-elide[in tree mode, do not squash boring parent accounts]'
'(--format)--format=[in tree mode, use this custom line format]:custom line format'
'(--sort-amount)--sort-amount[sort by amount instead of account code/name]'
)
;;
cashflow|cf|balancesheet|bs|incomestatement|is)
args+=(
'(--change)--change[show balance change in each period (default)]'
'(--cumulative)--cumulative[show balance change accumulated across periods]'
'(-H --historical)'{-H,--historical}'[show historical ending balance in each period]'
'(--tree)--tree[show accounts as a tree (default in simple reports)]'
'(--flat)--flat[show accounts as a list (default in multicolumn)]'
'(-A --average)'{-A,--average}'[show a row average column (in multicolumn mode)]'
'(-T --row-total)'{-T,--row-total}'[show a row total column]'
'(-N --no-total)'{-N,--no-total}'[do not show the final total row]'
'(--drop)--drop=[in flat mode, omit N leading account name parts]:drop n'
'(--no-elide)--no-elide[in tree mode, do not squash boring parent accounts]'
'(--format)--format=[in tree mode, use this custom line format]:custom line format'
'(--sort-amount)--sort-amount[sort by amount instead of account code/name]'
)
;;
import)
args=(
'(--dry-run)--dry-run[just show the transactions to be imported]'
)
;;
is|incomestatement)
args+=(
'(--change)--change[show balance change in each period (default)]'
'(--cumulative)--cumulative[show balance change accumulated across periods]'
'(-H --historical)'{-H,--historical}'[show historical ending balance in each period]'
'(--tree)--tree[show accounts as a tree (default in simple reports)]'
'(--flat)--flat[show accounts as a list (default in multicolumn)]'
'(-A --average)'{-A,--average}'[show a row average column (in multicolumn mode)]'
'(-T --row-total)'{-T,--row-total}'[show a row total column]'
'(-N --no-total)'{-N,--no-total}'[do not show the final total row]'
'(--drop)--drop=[in flat mode, omit N leading account name parts]:drop n'
'(--no-elide)--no-elide[in tree mode, do not squash boring parent accounts]'
'(--format)--format=[in tree mode, use this custom line format]:custom line format'
'(--sort-amount)--sort-amount[sort by amount instead of account code/name]'
)
;;
print)
args=(
'(-m --match)'{-m,--match}'[show the transaction whose description is most similar]:string'
'(--new)--new[show only newer-dated transactions added in each file since last run]'
'(-x --explicit)'{-x,--explicit}'[show all amounts explicitly]'
'(-O --output-format)'{-O,--output-format}='[select the output format from txt, csv, html]:format'
'(-o --output-file)'{-o,--output-file}'=[write output to file]:file'
)
;;
register|reg)
args+=(
'(--cumulative)--cumulative[show balance change accumulated across periods]'
'(-H --historical)'{-H,--historical}'[show historical ending balance in each period]'
'(-A --average)'{-A,--average}'[show a row average column (in multicolumn mode)]'
'(-r --related)'{-r,--related}'[show postings'\'' siblings instead]'
'(-w --width)'{-w,--width}'=[set output width to 120, or N]:width (default 80)'
'(-O --output-format)'{-O,--output-format}='[select the output format from txt, csv, html]:format'
'(-o --output-file)'{-o,--output-file}'=[write output to file]:file'
)
;;
stats)
args=(
'(-o --output-file)'{-o,--output-file}'=[write output to file]:file'
)
;;
# fallback to _default
*) _arguments -C -A "-*" "$args[@]" \
'*: :_default' && ret=0
continue
esac
_arguments -C -A "-*" "$args[@]" \
'*:query patterns:->query' && ret=0
;;
query)
local -a accs keywords
keywords=(
'acct\::match account names'
'code\::match by transaction code'
'desc\::match transaction descriptions'
'tag\::match by tag name'
'depth\::match at or above depth'
'status\::match cleared/uncleared transactions'
'real\::match real/virtual transactions'
'empty\::match if amount is/is not zero'
'amt\::match transaction amount'
'cur\::match by currency'
)
if compset -P 'amt:'; then
_message 'match amount (<, <=, >, >=, add sign for non-absolute match)' && ret=0
continue
fi
if compset -P '(#b)(code|desc|tag|depth|status|real|empty):'; then
_message "'$match[1]' parameter" && ret=0
continue
fi
accs=( ${(f)"$(_call_program hledger hledger accounts $PREFIX 2>/dev/null)"} )
if (( $? )); then
_message "error fetching accounts from hledger"
fi
# decided against partial matching here. these lines can
# be uncommented to complete subaccounts hierarchically
# (add -S '' -q to the compadd below, too)
# if compset -P '(#b)(*):'; then
# accs=( ${(M)accs:#$match[1]:*} )
# accs=( ${accs#$IPREFIX} )
# fi
# accs=( ${accs%%:*} )
_wanted accounts expl "accounts" compadd -a accs && ret=0
_describe "matcher keywords" keywords -S '' && ret=0
# not is special, it doesn't need the -S ''
keywords=(
'not:negate expression'
)
_describe "matcher keywords" keywords && ret=0
;;
esac
done
return ret
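For context, the `query` state above completes hledger's query-language keywords (`acct:`, `amt:`, `desc:`, `tag:`, `not`, ...). A minimal sketch of the kinds of command lines those completions produce; the account and tag names are hypothetical, and the commands are only assembled as strings and printed so the sketch runs without an hledger install or journal:

```sh
# Hypothetical hledger invocations using the query keywords completed above.
# Assembled and printed rather than executed, so no journal file is required.
q1="hledger register acct:expenses:food amt:'>100'"
q2="hledger balance tag:project not desc:refund"
printf '%s\n%s\n' "$q1" "$q2"
```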

View File

@@ -0,0 +1,65 @@
#compdef include-what-you-use
# Copyright 2018 CERN for the benefit of the LHCb Collaboration.
# All rights reserved.
#
# Developed by:
#
# CERN LBC group
#
# CERN
#
# http://cern.ch
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# with the Software without restriction, including without limitation the
# rights to use, copy, modify, merge, publish, distribute, sublicense, and/or
# sell copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# * Redistributions of source code must retain the above copyright notice,
# this list of conditions and the following disclaimers.
#
# * Redistributions in binary form must reproduce the above copyright
# notice, this list of conditions and the following disclaimers in the
# documentation and/or other materials provided with the distribution.
#
# * Neither the names of the LBC group, CERN, nor the names of its
# contributors may be used to endorse or promote products derived from
# this Software without specific prior written permission.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# CONTRIBUTORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS WITH
# THE SOFTWARE.
#
# In applying this licence, CERN does not waive the privileges and immunities
# granted to it by virtue of its status as an Intergovernmental Organization or
# submit itself to any jurisdiction.
# TODO:
# - prevent _iwyu_opts from running once the first clang option got passed
_iwyu_opts() {
_arguments '--check_also=[print iwyu-violation info for files matching the given glob pattern]:glob pattern:' \
'--cwd=[specify the current working directory]:current working directory:_path_files -/' \
'--howtodebug[print instructions on how to run iwyu under gdb]' \
'--howtodebug=[print instructions on how to run iwyu under gdb if file matches argument]:file for debug run:_path_files' \
'*'"--mapping_file=[iwyu mapping file]:iwyu mapping file:_path_files -g '*(/) *.imp'" \
"--no_default_mappings[do not add iwyu's default mappings]" \
'--pch_in_code[mark the first include in a translation unit as a precompiled header]' \
'--prefix_header_includes=[what to do with command line includes]:command line include treatment:(add keep remove)' \
"--transitive_includes_only[do not suggest that a file add headers that aren't already visible]" \
'--max_line_length=[maximum line length for includes]:a number:' \
'--no_comments[do not add "why" comments]' \
'--no_fwd_decls[do not use forward declarations]' \
'--verbose=[the higher the level, the more output]:a number:'
}
_arguments "*-Xiwyu[include-what-you-use options]:include-what-you-use options:_iwyu_opts"
# gcc will also provide --version and --help. Not ideal.
_gcc
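As the comment above notes, `_gcc` handles the compiler's own options, while each `*-Xiwyu` pair forwards one option to include-what-you-use itself. A sketch of the command shape this completion targets; the source file name and the chosen options are illustrative, and the command is only assembled and echoed so the sketch runs without iwyu installed:

```sh
# Every iwyu-specific option must be preceded by its own -Xiwyu; all other
# arguments fall through to the normal clang/gcc option handling.
cmd="include-what-you-use -Xiwyu --max_line_length=100 -Xiwyu --no_comments main.cc"
echo "$cmd"
```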

View File

@@ -42,8 +42,10 @@
# ------------------------------------------------------------------------------
_kak_sessions() {
-  session_ids=($(_call_program session_ids kak -l))
-  _values "${session_ids[@]}"
+  local -a session_ids expl
+  session_ids=($(_call_program session_names kak -l))
+  _description session-ids expl "session name"
+  compadd "$expl[@]" -a session_ids
}

_kak() {

View File

@@ -1,6 +1,6 @@
#compdef node
# ------------------------------------------------------------------------------
-# Copyright (c) 2011 Github zsh-users - http://github.com/zsh-users
+# Copyright (c) 2018 Github zsh-users - http://github.com/zsh-users
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
@@ -28,7 +28,7 @@
# Description
# -----------
#
-# Completion script for Node.js v0.8.4 (http://nodejs.org)
+# Completion script for Node.js v10.4.1 (https://nodejs.org)
#
# ------------------------------------------------------------------------------
# Authors
@@ -36,24 +36,64 @@
#
# * Mario Fernandez (https://github.com/sirech)
# * Nicholas Penree (https://github.com/drudge)
+# * Masafumi Koba (https://github.com/ybiquitous)
#
# ------------------------------------------------------------------------------

+_node_files() {
+  _files -g "*.(js|mjs)"
+}
+
local curcontext="$curcontext" state line ret=1
typeset -A opt_args

_arguments -C \
-  '(- 1 *)--help[print options help]' \
-  '(- 1 *)'{-v,--version}'[print node version]' \
-  '(--no-deprecation)--no-deprecation[silence deprecation warnings]' \
-  '(--trace-deprecation)--trace-deprecation[show stack traces on deprecations]' \
+  '-[script read from stdin (default; interactive mode if a tty)]' \
+  '--[indicate the end of node options]' \
+  '--abort-on-uncaught-exception[aborting instead of exiting causes a core file to be generated for analysis]' \
+  '--experimental-modules[experimental ES Module support and caching modules]' \
+  '--experimental-repl-await[experimental await keyword support in REPL]' \
+  '--experimental-vm-modules[experimental ES Module support in vm module]' \
+  '--icu-data-dir=[set ICU data load path to dir (overrides NODE_ICU_DATA) note: linked-in ICU data is present]: :_directories' \
+  '--inspect-brk=-[activate inspector on host:port and break at start of user script]:[host\:]port' \
+  '--inspect-port=[set host:port for inspector]:[host\:]port' \
+  '--inspect=-[activate inspector on host:port (default: 127.0.0.1:9229)]:[host\:]port' \
+  '--napi-modules[load N-API modules (no-op - option kept for compatibility)]' \
+  '--no-deprecation[silence deprecation warnings]' \
+  '--no-force-async-hooks-checks[disable checks for async_hooks]' \
+  '--no-warnings[silence all process warnings]' \
+  '--openssl-config=[load OpenSSL configuration from the specified file (overrides OPENSSL_CONF)]:file:_files' \
+  '--pending-deprecation[emit pending deprecation warnings]' \
+  '--preserve-symlinks[preserve symbolic links when resolving]' \
+  '--preserve-symlinks-main[preserve symbolic links when resolving the main module]' \
+  '--prof[generate V8 profiler output]' \
+  '--prof-process[process V8 profiler output generated using --prof]' \
+  '--redirect-warnings=[write warnings to file instead of stderr]: :_files' \
+  '--throw-deprecation[throw an exception on deprecations]' \
+  '--tls-cipher-list=[use an alternative default TLS cipher list]:cipher list string' \
+  '--trace-deprecation[show stack traces on deprecations]' \
+  '--trace-event-categories[comma separated list of trace event categories to record]: :{_values -s , categories node node.async_hooks node.bootstrap node.perf node.perf.usertiming node.perf.timerify node.fs.sync node.vm.script v8}' \
+  '--trace-event-file-pattern[Template string specifying the filepath for the trace-events data, it supports ${rotation} and ${pid} log-rotation id. %2$u is the pid.]:template string' \
+  '--trace-events-enabled[track trace events]' \
+  '--trace-sync-io[show stack trace when use of sync IO is detected after the first tick]' \
+  '--trace-warnings[show stack traces on process warnings]' \
+  '--track-heap-objects[track heap object allocations for heap snapshots]' \
+  '--use-bundled-ca[use bundled CA store (default)]' \
+  "--use-openssl-ca[use OpenSSL's default CA store]" \
  '(- 1 *)--v8-options[print v8 command line options]' \
-  '(--max-stack-size)--max-stack-size=[set max v8 stack size (bytes)]' \
-  '(-e --eval)'{-e,--eval}'[evaluate script]:Inline Script' \
-  '(-i --interactive)'{-i,--interactive}'[always enter the REPL even if stdin does not appear to be a terminal]' \
-  '(-p --print)'{-p,--print}'[print result of --eval]' \
-  '(--vars)--vars[print various compiled-in variables]' \
-  '*:JS Script:_files -g "*.js"' && ret=0
+  "--v8-pool-size=[set v8's thread pool size]:number" \
+  '--zero-fill-buffers[automatically zero-fill all newly allocated Buffer and SlowBuffer instances]' \
+  {-c,--check}'[syntax check script without executing]' \
+  '(- 1 *)'{-e,--eval}'[evaluate script]:inline JavaScript' \
+  '(- 1 *)'{-h,--help}'[print node command line options]' \
+  {-i,--interactive}'[always enter the REPL even if stdin does not appear to be a terminal]' \
+  '(- 1 *)'{-p,--print}'[evaluate script and print result]:inline JavaScript' \
+  '*'{-r,--require}'[module to preload (option can be repeated)]: :_node_files' \
+  '(- 1 *)'{-v,--version}'[print Node.js version]' \
+  '*: :_node_files' && ret=0
+
+_values 'commands' \
+  'inspect[enable inspector for debugging]' && ret=0

return ret
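For context, a few of the invocation shapes the updated option list covers; `app.js`, `setup.js`, and `main.mjs` are hypothetical file names, and the commands are assembled as strings and printed so the sketch runs without Node.js installed:

```sh
# Hypothetical node invocations matched by the completion above.
check_cmd="node --check app.js"            # -c/--check: parse only, do not run
eval_cmd="node -e 'console.log(1 + 1)'"    # -e/--eval: run inline JavaScript
preload_cmd="node -r ./setup.js main.mjs"  # -r/--require: preload a module
printf '%s\n%s\n%s\n' "$check_cmd" "$eval_cmd" "$preload_cmd"
```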

View File

@@ -1,58 +0,0 @@
#compdef paste
#
# ------------------------------------------------------------------------------
# Copyright (c) 2016 Github zsh-users - http://github.com/zsh-users
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
# * Redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer.
# * Redistributions in binary form must reproduce the above copyright
# notice, this list of conditions and the following disclaimer in the
# documentation and/or other materials provided with the distribution.
# * Neither the name of the zsh-users nor the
# names of its contributors may be used to endorse or promote products
# derived from this software without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
# ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
# WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
# DISCLAIMED. IN NO EVENT SHALL ZSH-USERS BE LIABLE FOR ANY
# DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
# (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
# LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
# ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
# SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
# ------------------------------------------------------------------------------
# Description
# -----------
#
# Completion script for paste (from util-linux 2.27.1->)
# (https://git.kernel.org/cgit/utils/util-linux/util-linux.git/)
#
# ------------------------------------------------------------------------------
# Authors
# -------
#
# * Ole Jørgen Brønner <olejorgenb@yahoo.no>
#
# ------------------------------------------------------------------------------
#
# Note: paste has upstream bash completion
# Note: Credit to https://github.com/RobSis/zsh-completion-generator which was
# used to generate most of the script.
local arguments
arguments=(
'*'{-d,--delimiters}'[reuse characters from LIST instead of TABs]:delimiters'
'(-s --serial)'{-s,--serial}'[paste one file at a time instead of in parallel]'
'(-z --zero-terminated)'{-z,--zero-terminated}'[line delimiter is NUL, not newline]'
'--help[display this help and exit]'
'--version[output version information and exit]'
'*:filename:_files'
)
_arguments -s $arguments

View File

@@ -303,7 +303,8 @@ _pg_restore () {
    --no-security-labels'[do not restore security labels]' \
    --no-tablespaces'[do not restore tablespace assignments]' \
    --section=':dump named section:_values "section" pre-data data post-data' \
-    --use-set-session-authorization'[use SET SESSION AUTHORIZATION commands instead of ALTER OWNER commands to set ownership]'
+    --use-set-session-authorization'[use SET SESSION AUTHORIZATION commands instead of ALTER OWNER commands to set ownership]' \
+    "1: :_files"
}

_pg_dumpall () {

View File

@@ -119,7 +119,7 @@ _port() {
  )

  pseudo_common=(all current active inactive actinact installed uninstalled outdated
-    obsolete requested unrequested leaves)
+    obsolete requested unrequested leaves rleaves)

  pseudo_advanced=('variants:' 'variant:' 'description:' 'depends:'
    'depends_lib:' 'depends_run:' 'depends_build:' 'depends_fetch:' 'depends_extract:'
@@ -127,6 +127,7 @@
    'maintainers:' 'maintainer:' 'categories:' 'category:' 'version:' 'revision:' 'license:')

  select_options=(
+    '--summary:Display summary of selected options'
    '--list:List available versions for the group'
    '--set:Select the given version for the group'
    '--show:Show which version is currently selected for the group (default if none given)'
@@ -157,7 +158,7 @@
    '-F[Read and process the file of commands specified by the argument.]' \
    '-p[Despite any errors encountered, proceed to process multiple ports and commands.]' \
    '-y[Perform a dry run.]' \
-    '-t[Enable trace mode debug facilities on platforms that support it (Mac OS X).]' \
+    '-t[Enable trace mode debug facilities on platforms that support it (macOS).]' \
    "1:Port actions:(($actions))" \
    '::Per-action arguments:_port_dispatch' \
    && return 0

View File

@@ -1,119 +0,0 @@
#compdef rclone
# ------------------------------------------------------------------------------
# Copyright (c) 2016 Github zsh-users - http://github.com/zsh-users
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
# * Redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer.
# * Redistributions in binary form must reproduce the above copyright
# notice, this list of conditions and the following disclaimer in the
# documentation and/or other materials provided with the distribution.
# * Neither the name of the zsh-users nor the
# names of its contributors may be used to endorse or promote products
# derived from this software without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
# ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
# WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
# DISCLAIMED. IN NO EVENT SHALL ZSH-USERS BE LIABLE FOR ANY
# DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
# (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
# LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
# ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
# SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
# ------------------------------------------------------------------------------
# Description
# -----------
#
# Completion script for rclone (http://rclone.org/).
#
# ------------------------------------------------------------------------------
# Author(s)
# -------
#
# * Rajat Roy <i3wm.debian@gmail.com>
#
# ------------------------------------------------------------------------------
_rclone() {
local state
_arguments \
'1: :->subcommand' \
'--bwlimit' \
'--checkers' \
'(-c --checksum)'{-c,--checksum}'[check the file hash and size to determine if files are equal]' \
'--config' \
'--contimeout' \
'--dedupe-mode' \
'(-n --dry-run)'{-n,--dry-run}'[Do a trial run with no permanent changes]' \
'--ignore-existing' \
'--ignore-size' \
'(-I --ignore-times)'{-I,--ignore-times}'[unconditionally upload all files regardless of the state of files on the destination]' \
'--log-file' \
'--low-level-retries' \
'--max-depth' \
'--modify-window' \
'--no-gzip-encoding' \
'--no-update-modtime' \
'(-q --quiet)'{-q,--quiet}'[as little output as possible]' \
'--retries' \
'--size-only' \
'--stats' \
'--delete-' \
'--timeout' \
'--transfers' \
'(-u --update)'{-u,--update}'[skip any files which exist on the destination and have a modified time that is newer than the source file]' \
'(-v --verbose)'{-v,--verbose}'[tells you about every file it considers and transfers]' \
'--delete-excluded' \
'--filter' \
'--filter-from' \
'-exclude' \
'--exclude-from' \
'--include' \
'--include-from' \
'--files-from' \
'--min-size' \
'--max-size' \
'--min-age' \
'--max-age' \
'--dump-filters' \
'*:files:_files'
case $state in
subcommand)
_arguments '1: :(
config
copy
sync
move
delete
purge
mkdir
rmdir
check
ls
lsd
lsl
md5sum
sha1sum
size
version
cleanup
dedupe
authorize
cat
genautocomplete
gendocs
listremotes
mount
--help)'
;;
esac
}
_rclone "$@"

View File

@@ -48,6 +48,9 @@ _rspec() {
    '*'{-O,--options}'[Specify the path to a custom options file]:PATH:_files' \
    --order'[Run examples by the specified order type]: :->order' \
    --seed'[Equivalent of --order rand:SEED]: :_guard "[[\:digit\:]]#" "SEED"' \
+    --bisect'[Repeatedly runs the suite in order to isolate the failures to the smallest reproducible case]' \
+    --only-failures'[Filter to just the examples that failed the last time they ran]' \
+    '(-n --next-failure)'{-n,--next-failure}'[Apply `--only-failures` and abort after one failure (equivalent to `--only-failures --fail-fast --order defined`)]' \
    --fail-fast'[Abort the run on first failure]' \
    --no-fail-fast'[Do not abort the run on first failure]' \
    --failure-exit-code'[Override the exit code used when there are failing specs]: :_guard "[[\:digit\:]]#" "CODE"' \
@@ -58,8 +61,10 @@ _rspec() {
    '(-o --out)'{-o,--out}'[Write output to a file instead of $stdout]:FILE:_files' \
    --deprecation-out'[Write deprecation warnings to a file instead of $stderr]:FILE:_files' \
    '(-b --backtrace)'{-b,--backtrace}'[Enable full backtrace]' \
-    '(-c --color)'{-c,--color}'[Enable color in the output]' \
+    --force-color'[Force the output to be in color, even if the output is not a TTY]' \
+    --no-color'[Force the output to not be in color, even if the output is a TTY]' \
    '(-p --profile)'{-p,--profile}'[Enable profiling of examples and list the slowest examples (default: 10)]: :_guard "[[\:digit\:]]#" "COUNT"' \
+    --no-profile'[Disable profiling of examples]' \
    '(-w --warnings)'{-w,--warnings}'[Enable ruby warnings]' \
    '(-P --pattern)'{-P,--pattern}'[Load files matching pattern (default: "spec/**/*_spec.rb")]:PATTERN:' \
    --exclude-pattern'[Load files except those matching pattern]:PATTERN:' \

View File

@@ -1,81 +0,0 @@
#compdef scl
# ------------------------------------------------------------------------------
# Description
# -----------
#
# Completion script for Software Collections (https://www.softwarecollections.org).
#
# ------------------------------------------------------------------------------
# Authors
# -------
#
# * Julien Nicoulaud <julien.nicoulaud@gmail.com>
#
# ------------------------------------------------------------------------------
_scl() {
local context curcontext="$curcontext" state line
typeset -A opt_args
local ret=1
_arguments -C \
'1: :_scl_cmds' \
"(- : *)"{-h,--help}'[display help information]' \
"(- : *)"{-l,--list}'[list installed Software Collections or packages that belong to them]:installed collection:_scl_installed_collections' \
'*::arg:->args' \
&& ret=0
case $state in
(args)
curcontext="${curcontext%:*:*}:scl-cmd-$words[1]:"
case $line[1] in
(register)
_arguments '1:Software Collection path:_files -/' && ret=0
;;
(deregister)
_arguments -C \
'1:: :_scl_installed_collections' \
'--force[force suppression of the collection]' \
&& ret=0
;;
(enable)
_arguments -C \
'1:: :_scl_installed_collections' \
'2:command: _command_names -e' \
&& ret=0
;;
*)
_call_function ret _scl_cmd_$words[1] && ret=0
(( ret )) && _message 'no more arguments'
;;
esac
;;
esac
}
(( $+functions[_scl_cmds] )) ||
_scl_cmds() {
local commands; commands=(
'register:register a Software Collection'
'deregister:deregister a Software Collection'
'enable:enable a Software Collection'
)
_describe -t commands 'scl command' commands "$@"
}
(( $+functions[_scl_installed_collections] )) ||
_scl_installed_collections() {
local installed_collections; installed_collections=($(_call_program installed_collections $service --list))
_describe -t collections 'Software Collection' installed_collections "$@"
}
_scl "$@"
# Local Variables:
# mode: Shell-Script
# sh-indentation: 2
# indent-tabs-mode: nil
# sh-basic-offset: 2
# End:
# vim: ft=zsh sw=2 ts=2 et


@@ -0,0 +1,72 @@
#compdef scons
# ------------------------------------------------------------------------------
# Description
# -----------
#
# Completion script for SCons 2.5.1 (http://scons.org/)
#
# ------------------------------------------------------------------------------
# Authors
# -------
#
# * Oliver Kiddle <okiddle@yahoo.co.uk>
#
# ------------------------------------------------------------------------------
_arguments -s -S \
'(-c --clean --remove)-'{c,-clean,-remove}'[remove specified targets and dependencies]' \
'(-C --directory)-'{C,-directory=}'[change to specified directory first]:directory:_directories' \
'--cache-debug=[print CacheDir debug info to file]:file:_files' \
'(--cache-disable --no-cache)--'{cache-disable,no-cache}"[don't retrieve built targets from cache]" \
'(--cache-force --cache-populate)--'{cache-force,cache-populate}'[copy already-built targets into cache]' \
"--cache-readonly[don't update CacheDir with built targets]" \
'--cache-show[print build actions for files from cache]' \
'--config=[set Configure mode]:mode:(auto force cache)' \
'(-u --up --search-up -U)-D[search up for SConstruct; build default targets]' \
'--debug=[print debugging information]:type:(
count duplicate explain findlibs includes memoizer memory objects pdb prepare presub stacktrace time)' \
'--diskcheck=[enable specific on-disk checks]:check:(all none match rcs sccs)' \
'--duplicate=[set preferred file duplication methods]:file duplication methods:(
hard-soft-copy soft-hard-copy hard-copy soft-copy copy)' \
'(-f --file --makefile --sconstruct)-'{f,-file=,-makefile=,-sconstruct=}'[specify SConstruct file]:file:_files' \
'(-)-'{h,-help}'[display defined usage information]' \
'(-)-'{H,-help-options}'[display usage information]' \
  '(-i --ignore-errors)-'{i,-ignore-errors}'[ignore errors from build actions]' \
\*{-I,--include-dir=}'[add directory to search Python modules]:directories:_directories' \
'(--implicit-deps-changed --implicit-deps-unchanged)--implicit-cache[cache scanned dependencies]' \
'(--implicit-cache --implicit-deps-changed)--implicit-deps-changed[rescan dependencies]' \
'(--implicit-cache --implicit-deps-unchanged)--implicit-deps-unchanged[ignore changes to scanned dependencies]' \
'--interactive[start interactive mode]' \
  '(-j --jobs)-'{j,-jobs=}'[specify number of jobs to run in parallel]' \
'(-k --keep-going)-'{k,-keep-going}'[continue after an error]' \
'--max-drift=[set the maximum clock drift]:drift (seconds)' \
'--md5-chunksize=[set chunksize for MD5 signature computation]:size (kB)' \
'(-n --just-print --dry-run --recon)-'{n,-just-print,-dry-run,-recon}"[print commands but don't run them]" \
"--no-site-dir[don't use the usual site_scons directory]" \
'--profile=[profile scons]:output file:_files' \
  '(-q --question)-'{q,-question}'[query whether up-to-date]' \
'-Q[suppress progress messages]' \
'--random[build dependencies in random order]' \
'(-s --silent --quiet)-'{s,-silent,-quiet}"[don't print commands]" \
'--site-dir=[specify site_scons directory]:directory:_directories' \
'--stack-size[set stacksize of threads]:size (kB)' \
'--taskmastertrace=[trace node evaluation to file]:file' \
'--tree=[print dependency tree]:format:(all derived prune status)' \
'(-u --up --search-up -D -U)-'{u,-up,-search-up}'[search up for SConstruct; build current directory]' \
'(-u --up --search-up -D)-U[search up for SConstruct; build Default targets]' \
'(-)-'{v,-version}'[print version information]' \
\*{--warn=,--warning=}'[enable or disable warnings]:type:(
all cache-write-error corrupt-sconsign dependency deprecated
deprecated-copy deprecated-source-signatures deprecated-target-signatures
duplicate-environment fortran-cxx-mix link misleading-keywords
missing-sconscript no-md5-module no-metaclass-support no-object-count
no-parallel-support python-version reserved-variable stack-size no-all
no-cache-write-error no-corrupt-sconsign no-dependency no-deprecated
no-deprecated-copy no-deprecated-source-signatures
no-deprecated-target-signatures no-duplicate-environment
no-fortran-cxx-mix no-link no-misleading-keywords no-missing-sconscript
no-no-md5-module no-no-metaclass-support no-no-object-count
no-no-parallel-support no-python-version no-reserved-variable
no-stack-size)' \
\*{-Y,--repository}'[search specified repository for files]:repository:_files' \
'*:target:_default' # Doesn't seem to be a simple way to get a list of targets


@@ -0,0 +1,935 @@
#compdef sfdx
# DESCRIPTION: Zsh completion script for the Salesforce CLI
# AUTHOR: Wade Wegner (@WadeWegner)
# REPO: https://github.com/wadewegner/salesforce-cli-zsh-completion
# LICENSE: https://github.com/wadewegner/salesforce-cli-zsh-completion/blob/master/LICENSE
local -a _1st_arguments
_1st_arguments=(
"force\:alias\:list":"list username aliases for the Salesforce CLI"
"force\:alias\:set":"set username aliases for the Salesforce CLI"
"force\:apex\:class\:create":"create an Apex class"
"force\:apex\:execute":"execute anonymous Apex code"
"force\:apex\:log\:get":"fetch a debug log"
"force\:apex\:log\:list":"list debug logs"
"force\:apex\:test\:report":"display test results"
"force\:apex\:test\:run":"invoke Apex tests"
"force\:apex\:trigger\:create":"create an Apex trigger"
"force\:auth\:jwt\:grant":"authorize an org using the JWT flow"
"force\:auth\:sfdxurl\:store":"authorize an org using an SFDX auth URL"
"force\:auth\:web\:login":"authorize an org using the web login flow"
"force\:config\:get":"get config var values for given names"
"force\:config\:list":"list config vars for the Salesforce CLI"
"force\:config\:set":"set config vars for the Salesforce CLI"
"force\:data\:bulk\:delete":"bulk delete records from a csv file"
"force\:data\:bulk\:status":"view the status of a bulk data load job or batch"
"force\:data\:bulk\:upsert":"bulk upsert records from a CSV file"
"force\:data\:record\:create":"create a record"
"force\:data\:record\:delete":"delete a record"
"force\:data\:record\:get":"view a record"
"force\:data\:record\:update":"update a record"
"force\:data\:soql\:query":"execute a SOQL query"
"force\:data\:tree\:export":"export data from an org into sObject tree format for force:data:tree:import consumption"
"force\:data\:tree\:import":"import data into an org using SObject Tree Save API"
"force\:doc\:commands\:display":"display help for force commands"
"force\:doc\:commands\:list":"list the force commands"
"force\:lightning\:app\:create":"create a Lightning app"
"force\:lightning\:component\:create":"create a Lightning component"
"force\:lightning\:event\:create":"create a Lightning event"
"force\:lightning\:interface\:create":"create a Lightning interface"
"force\:lightning\:lint":"analyse (lint) Lightning component code"
"force\:lightning\:test\:create":"create a Lightning test"
"force\:lightning\:test\:install":"install Lightning Testing Service unmanaged package in your org"
"force\:lightning\:test\:run":"invoke Lightning component tests"
"force\:limits\:api\:display":"display current orgs limits"
"force\:mdapi\:convert":"convert Metadata API source into the Salesforce DX source format"
"force\:mdapi\:deploy":"deploy metadata to an org using Metadata API"
"force\:mdapi\:deploy\:report":"check the status of a metadata deployment"
"force\:mdapi\:retrieve":"retrieve metadata from an org using Metadata API"
"force\:mdapi\:retrieve\:report":"check the status of a metadata retrieval"
"force\:org\:create":"create a scratch org"
"force\:org\:delete":"mark a scratch org for deletion"
"force\:org\:display":"get org description"
"force\:org\:list":"list all orgs youve created or authenticated to"
"force\:org\:open":"open an org in your browser"
"force\:org\:shape\:create":"create a snapshot of org edition, features, and licenses"
"force\:org\:shape\:delete":"delete all org shapes for a target org"
"force\:org\:shape\:list":"list all org shapes youve created"
"force\:package1\:version\:create":"create a first-generation package version in the release org"
"force\:package1\:version\:create\:get":"retrieve the status of a package version creation request"
"force\:package1\:version\:display":"display details about a first-generation package version"
"force\:package1\:version\:list":"list package versions for the specified first-generation package or for the org"
"force\:package2\:create":"create a second-generation package"
"force\:package2\:list":"list all second-generation packages in the Dev Hub org"
"force\:package2\:update":"update a second-generation package"
"force\:package2\:version\:create":"create a second-generation package version"
"force\:package2\:version\:create\:get":"retrieve a package version creation request"
"force\:package2\:version\:create\:list":"list package version creation requests"
"force\:package2\:version\:get":"retrieve a package version in the Dev Hub org"
"force\:package2\:version\:list":"list all package versions in the Dev Hub org"
"force\:package2\:version\:update":"update a second-generation package version"
"force\:package\:install":"install a package in the target org"
"force\:package\:install\:get":"retrieve the status of a package installation request"
"force\:package\:installed\:list":"list the orgs installed packages"
"force\:package\:uninstall":"uninstall a second-generation package from the target org"
"force\:package\:uninstall\:get":"retrieve status of package uninstall request"
"force\:project\:create":"create a new SFDX project"
"force\:project\:upgrade":"update project config files to the latest format"
"force\:schema\:sobject\:describe":"describe an object"
"force\:schema\:sobject\:list":"list all objects of a specified category"
"force\:source\:convert":"convert Salesforce DX source into the Metadata API source format"
"force\:source\:open":"edit a Lightning Page with Lightning App Builder"
"force\:source\:pull":"pull source from the scratch org to the project"
"force\:source\:push":"push source to an org from the project"
"force\:source\:status":"list local changes and/or changes in a scratch org"
"force\:user\:create":"create a user for a scratch org"
"force\:user\:display":"displays information about a user of a scratch org"
"force\:user\:list":"lists all users of a scratch org"
"force\:user\:password\:generate":"generate a password for scratch org users"
"force\:user\:permset\:assign":"assign a permission set to one or more users of an org"
"force\:visualforce\:component\:create":"create a Visualforce component"
"force\:visualforce\:page\:create":"create a Visualforce page"
)
_arguments '*:: :->command'
if (( CURRENT == 1 )); then
_describe -t commands "sfdx command" _1st_arguments
return
fi
local -a _command_args
case "$words[1]" in
force:limits:api:display)
_command_args=(
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:lightning:app:create)
_command_args=(
'(-n|--appname)'{-n,--appname}'[name of the generated Lightning app]' \
'(-t|--template)'{-t,--template}'[template to use for file creation (DefaultLightningApp*)]' \
'(-d|--outputdir)'{-d,--outputdir}'[folder for saving the created files]' \
'(-r|--reflect)'{-r,--reflect}'[switch to return flag detailed information]' \
'(-a|--apiversion)'{-a,--apiversion}'[API version number (41.0*,40.0)]' \
'(--json)--json[JSON output]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:data:bulk:delete)
_command_args=(
      '(-s|--sobjecttype)'{-s,--sobjecttype}'[the sObject type of the records you’re deleting]' \
'(-f|--csvfile)'{-f,--csvfile}'[the path to the CSV file containing the ids of the records to delete]:file:_files' \
'(-w|--wait)'{-w,--wait}'[the number of minutes to wait for the command to complete before displaying the results]' \
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:data:bulk:status)
_command_args=(
'(-i|--jobid)'{-i,--jobid}'[the ID of the job you want to view or of the job whose batch you want to view]' \
'(-b|--batchid)'{-b,--batchid}'[the ID of the batch whose status you want to view]' \
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:data:bulk:upsert)
_command_args=(
'(-s|--sobjecttype)'{-s,--sobjecttype}'[the sObject type of the records you want to upsert]' \
'(-f|--csvfile)'{-f,--csvfile}'[the path to the CSV file that defines the records to upsert]:file:_files' \
'(-i|--externalid)'{-i,--externalid}'[the column name of the external ID; if not provided, an arbitrary ID is used]' \
'(-w|--wait)'{-w,--wait}'[the number of minutes to wait for the command to complete before displaying the results]' \
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:apex:class:create)
_command_args=(
'(-n|--classname)'{-n,--classname}'[name of the generated Apex class]' \
'(-t|--template)'{-t,--template}'[template to use for file creation (DefaultApexClass*,ApexException,ApexUnitTest,InboundEmailService)]' \
'(-d|--outputdir)'{-d,--outputdir}'[folder for saving the created files]' \
'(-r|--reflect)'{-r,--reflect}'[switch to return flag detailed information]' \
'(-a|--apiversion)'{-a,--apiversion}'[API version number (41.0*,40.0)]' \
'(--json)--json[JSON output]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:doc:commands:display)
_command_args=(
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:doc:commands:list)
_command_args=(
'(-u|--usage)'{-u,--usage}'[list only docopts usage strings]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:visualforce:component:create)
_command_args=(
'(-t|--template)'{-t,--template}'[template to use for file creation (DefaultVFComponent*)]' \
'(-d|--outputdir)'{-d,--outputdir}'[folder for saving the created files]' \
'(-r|--reflect)'{-r,--reflect}'[switch to return flag detailed information]' \
'(-n|--componentname)'{-n,--componentname}'[name of the generated Visualforce component]' \
'(-a|--apiversion)'{-a,--apiversion}'[API version number (41.0*,40.0)]' \
'(-l|--label)'{-l,--label}'[Visualforce component label]' \
'(--json)--json[JSON output]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:lightning:component:create)
_command_args=(
'(-n|--componentname)'{-n,--componentname}'[name of the generated Lightning component]' \
'(-t|--template)'{-t,--template}'[template to use for file creation (DefaultLightningCmp*)]' \
'(-d|--outputdir)'{-d,--outputdir}'[folder for saving the created files]' \
'(-r|--reflect)'{-r,--reflect}'[switch to return flag detailed information]' \
'(-a|--apiversion)'{-a,--apiversion}'[API version number (41.0*,40.0)]' \
'(--json)--json[JSON output]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:mdapi:convert)
_command_args=(
'(-r|--rootdir)'{-r,--rootdir}'[the root directory containing the Metadata API source]:file:_files' \
'(-d|--outputdir)'{-d,--outputdir}'[the output directory to store the sfdx source]:file:_files' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:source:convert)
_command_args=(
'(-r|--rootdir)'{-r,--rootdir}'[the source directory for the source to be converted]:file:_files' \
'(-d|--outputdir)'{-d,--outputdir}'[the output directory to export the Metadata API source to]:file:_files' \
'(-n|--packagename)'{-n,--packagename}'[the name of the package to associate with the Metadata API source]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:org:create)
_command_args=(
'(-f|--definitionfile)'{-f,--definitionfile}'[path to a scratch org definition file]:file:_files' \
'(-j|--definitionjson)'{-j,--definitionjson}'[scratch org definition in json format ]' \
'(-n|--nonamespace)'{-n,--nonamespace}'[creates the scratch org with no namespace]' \
'(-c|--noancestors)'{-c,--noancestors}'[do not include second-generation package ancestors in the scratch org]' \
'(-i|--clientid)'{-i,--clientid}'[connected app consumer key]' \
'(-s|--setdefaultusername)'{-s,--setdefaultusername}'[set the created org as the default username]' \
'(-a|--setalias)'{-a,--setalias}'[set an alias for for the created scratch org]' \
'(-e|--env)'{-e,--env}'[environment where the scratch org is created: \[sandbox*,virtual,prototype\] (sandbox*,virtual,prototype)]' \
'(-w|--wait)'{-w,--wait}'[the streaming client socket timeout (in minutes) (default:6, min:2)]' \
'(-d|--durationdays)'{-d,--durationdays}'[duration of the scratch org (in days) (default:7, min:1, max:30)]' \
'(-v|--targetdevhubusername)'{-v,--targetdevhubusername}'[username or alias for the dev hub org; overrides default dev hub org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:package2:create)
_command_args=(
'(-n|--name)'{-n,--name}'[package name]' \
'(-o|--containeroptions)'{-o,--containeroptions}'[\[*Managed | Unlocked | Locked\] container options for the package (Managed=DeveloperManagedSubscriberManaged, Unlocked=DeveloperControlledSubscriberEditable, Locked=DeveloperControlledSubscriberLocked)]' \
'(-d|--description)'{-d,--description}'[package description]' \
'(-e|--nonamespace)'{-e,--nonamespace}'[creates the package with no namespace; available only for developer-controlled packages.]' \
'(-v|--targetdevhubusername)'{-v,--targetdevhubusername}'[username or alias for the dev hub org; overrides default dev hub org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:user:create)
_command_args=(
'(-f|--definitionfile)'{-f,--definitionfile}'[file path to a user definition]:file:_files' \
'(-a|--setalias)'{-a,--setalias}'[set an alias for the created username to reference within the CLI]' \
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(-v|--targetdevhubusername)'{-v,--targetdevhubusername}'[username or alias for the dev hub org; overrides default dev hub org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:project:create)
_command_args=(
'(-n|--projectname)'{-n,--projectname}'[name of the generated project]' \
'(-t|--template)'{-t,--template}'[template to use for file creation (Defaultsfdx-project.json*)]' \
'(-d|--outputdir)'{-d,--outputdir}'[folder for saving the created files]' \
'(-r|--reflect)'{-r,--reflect}'[switch to return flag detailed information]' \
'(-l|--loginurl)'{-l,--loginurl}'[Salesforce instance login URL (https://login.salesforce.com*)]' \
'(-x|--sourceapiversion)'{-x,--sourceapiversion}'[source API version number (41.0*)]' \
'(-s|--namespace)'{-s,--namespace}'[project associated namespace]' \
'(-p|--defaultpackagedir)'{-p,--defaultpackagedir}'[default package directory name (force-app*)]:file:_files' \
'(--json)--json[JSON output]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:org:delete)
_command_args=(
'(-p|--noprompt)'{-p,--noprompt}'[no prompt to confirm deletion]' \
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org]' \
'(-v|--targetdevhubusername)'{-v,--targetdevhubusername}'[username or alias for the dev hub org; overrides default dev hub org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:mdapi:deploy)
_command_args=(
      '(-c|--checkonly)'{-c,--checkonly}'[validate deploy but don’t save to the org (default:false)]' \
'(-d|--deploydir)'{-d,--deploydir}'[root of directory tree of files to deploy]:file:_files' \
'(-w|--wait)'{-w,--wait}'[wait time for command to finish in minutes (default: 0)]' \
'(-i|--jobid)'{-i,--jobid}'[WARNING: The flag "jobid" has been deprecated and will be removed in v41.01.0 or later. Instead, use "sfdx force:mdapi:deploy:report -i <jobId>".]' \
'(-l|--testlevel)'{-l,--testlevel}'[deployment testing level (NoTestRun,RunSpecifiedTests,RunLocalTests,RunAllTestsInOrg)]' \
'(-r|--runtests)'{-r,--runtests}'[tests to run if --testlevel RunSpecifiedTests]' \
'(-e|--rollbackonerror)'{-e,--rollbackonerror}'[WARNING: The flag "rollbackonerror" has been deprecated and will be removed in v41.01.0 or later. Instead, use "ignoreerrors".]' \
'(-o|--ignoreerrors)'{-o,--ignoreerrors}'[ignore any errors and do not roll back deployment (default:false)]' \
'(-g|--ignorewarnings)'{-g,--ignorewarnings}'[whether a warning will allow a deployment to complete successfully (default:false)]' \
'(-f|--zipfile)'{-f,--zipfile}'[path to .zip file of metadata to deploy]:file:_files' \
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
'(--verbose)--verbose[verbose output of deploy results]' \
)
;;
force:mdapi:deploy:report)
_command_args=(
'(-w|--wait)'{-w,--wait}'[wait time for command to finish in minutes (default: 0)]' \
'(-i|--jobid)'{-i,--jobid}'[job ID of the deployment you want to check]' \
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
'(--verbose)--verbose[verbose output of deploy results]' \
)
;;
force:org:display)
_command_args=(
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
'(--verbose)--verbose[emit additional command output to stdout]' \
)
;;
force:user:display)
_command_args=(
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(-v|--targetdevhubusername)'{-v,--targetdevhubusername}'[username or alias for the dev hub org; overrides default dev hub org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:lightning:event:create)
_command_args=(
'(-n|--eventname)'{-n,--eventname}'[name of the generated Lightning event]' \
'(-t|--template)'{-t,--template}'[template to use for file creation (DefaultLightningEvt*)]' \
'(-d|--outputdir)'{-d,--outputdir}'[folder for saving the created files]' \
'(-r|--reflect)'{-r,--reflect}'[switch to return flag detailed information]' \
'(-a|--apiversion)'{-a,--apiversion}'[API version number (41.0*,40.0)]' \
'(--json)--json[JSON output]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:apex:execute)
_command_args=(
'(-f|--apexcodefile)'{-f,--apexcodefile}'[path to a local file containing Apex code]:file:_files' \
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:config:get)
_command_args=(
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
'(--verbose)--verbose[emit additional command output to stdout]' \
)
;;
force:package:install)
_command_args=(
'(-i|--id)'{-i,--id}'[ID of the package version to install (starts with 04t)]' \
'(-w|--wait)'{-w,--wait}'[number of minutes to wait for installation status]' \
'(-k|--installationkey)'{-k,--installationkey}'[installation key for key-protected package (default: null)]' \
'(-p|--publishwait)'{-p,--publishwait}'[number of minutes to wait for subscriber package version ID to become available in the target org]' \
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:package:install:get)
_command_args=(
'(-i|--requestid)'{-i,--requestid}'[ID of the package install request you want to check]' \
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:package:installed:list)
_command_args=(
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:lightning:interface:create)
_command_args=(
'(-n|--interfacename)'{-n,--interfacename}'[name of the generated Lightning interface]' \
'(-t|--template)'{-t,--template}'[template to use for file creation (DefaultLightningIntf*)]' \
'(-d|--outputdir)'{-d,--outputdir}'[folder for saving the created files]' \
'(-r|--reflect)'{-r,--reflect}'[switch to return flag detailed information]' \
'(-a|--apiversion)'{-a,--apiversion}'[API version number (41.0*,40.0)]' \
'(--json)--json[JSON output]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:auth:jwt:grant)
_command_args=(
'(-u|--username)'{-u,--username}'[authentication username]' \
'(-f|--jwtkeyfile)'{-f,--jwtkeyfile}'[path to a file containing the private key]:file:_files' \
'(-i|--clientid)'{-i,--clientid}'[OAuth client ID (sometimes called the consumer key)]' \
'(-r|--instanceurl)'{-r,--instanceurl}'[the login URL of the instance the org lives on]' \
'(-d|--setdefaultdevhubusername)'{-d,--setdefaultdevhubusername}'[set the authenticated org as the default dev hub org for scratch org creation]' \
'(-s|--setdefaultusername)'{-s,--setdefaultusername}'[set the authenticated org as the default username that all commands run against]' \
'(-a|--setalias)'{-a,--setalias}'[set an alias for the authenticated org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:lightning:lint)
_command_args=(
'(-i|--ignore)'{-i,--ignore}'[pattern used to ignore some folders]' \
'(--files)--files[pattern used to include specific files]:file:_files' \
'(-j|--json)'{-j,--json}'[format output as JSON]' \
'(--config)--config[path to a custom ESLint configuration file]:file:_files' \
'(--verbose)--verbose[report warnings in addition to errors]' \
'(--exit)--exit[exit with error code 1 if there are lint issues]' \
)
;;
force:alias:list)
_command_args=(
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:config:list)
_command_args=(
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:org:list)
_command_args=(
'(--all)--all[include expired, deleted, and unknown-status scratch orgs]' \
'(--clean)--clean[remove all local org authorizations for non-active orgs]' \
'(-p|--noprompt)'{-p,--noprompt}'[do not prompt for confirmation]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
'(--verbose)--verbose[list more information about each org]' \
)
;;
force:package2:list)
_command_args=(
'(-v|--targetdevhubusername)'{-v,--targetdevhubusername}'[username or alias for the dev hub org; overrides default dev hub org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:user:list)
_command_args=(
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(-v|--targetdevhubusername)'{-v,--targetdevhubusername}'[username or alias for the dev hub org; overrides default dev hub org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:apex:log:get)
_command_args=(
'(-i|--logid)'{-i,--logid}'[ID of the log to display]' \
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:apex:log:list)
_command_args=(
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:org:open)
_command_args=(
'(-p|--path)'{-p,--path}'[navigation URL path]:file:_files' \
'(-r|--urlonly)'{-r,--urlonly}'[display navigation URL, but dont launch browser]' \
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:source:open)
_command_args=(
'(-f|--sourcefile)'{-f,--sourcefile}'[file to edit]:file:_files' \
'(-r|--urlonly)'{-r,--urlonly}'[generate a navigation URL; dont launch the editor]' \
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:visualforce:page:create)
_command_args=(
'(-t|--template)'{-t,--template}'[template to use for file creation (DefaultVFPage*)]' \
'(-d|--outputdir)'{-d,--outputdir}'[folder for saving the created files]' \
'(-r|--reflect)'{-r,--reflect}'[switch to return flag detailed information]' \
'(-n|--pagename)'{-n,--pagename}'[name of the generated Visualforce page]' \
'(-a|--apiversion)'{-a,--apiversion}'[API version number (41.0*,40.0)]' \
'(-l|--label)'{-l,--label}'[Visualforce page label]' \
'(--json)--json[JSON output]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:user:password:generate)
_command_args=(
'(-o|--onbehalfof)'{-o,--onbehalfof}'[comma-separated list of usernames for which to generate passwords]' \
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(-v|--targetdevhubusername)'{-v,--targetdevhubusername}'[username or alias for the dev hub org; overrides default dev hub org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:user:permset:assign)
_command_args=(
'(-n|--permsetname)'{-n,--permsetname}'[the name of the permission set to assign]' \
'(-o|--onbehalfof)'{-o,--onbehalfof}'[comma-separated list of usernames or aliases to assign the permission set to]' \
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:source:pull)
_command_args=(
        '(-w|--wait)'{-w,--wait}'[wait time for command to finish in minutes (default: 33, min: 1)]' \
'(-f|--forceoverwrite)'{-f,--forceoverwrite}'[ignore conflict warnings and overwrite changes to the project]' \
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:source:push)
_command_args=(
'(-f|--forceoverwrite)'{-f,--forceoverwrite}'[ignore conflict warnings and overwrite changes to scratch org]' \
'(-g|--ignorewarnings)'{-g,--ignorewarnings}'[deploy changes even if warnings are generated]' \
        '(-w|--wait)'{-w,--wait}'[wait time for command to finish in minutes (default: 33, min: 1)]' \
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:data:record:create)
_command_args=(
'(-s|--sobjecttype)'{-s,--sobjecttype}'[the type of the record youre creating]' \
'(-v|--values)'{-v,--values}'[the <fieldName>=<value> pairs youre creating]' \
'(-t|--usetoolingapi)'{-t,--usetoolingapi}'[create the record with tooling api]' \
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:data:record:delete)
_command_args=(
'(-s|--sobjecttype)'{-s,--sobjecttype}'[the type of the record youre deleting]' \
'(-i|--sobjectid)'{-i,--sobjectid}'[the ID of the record youre deleting]' \
'(-w|--where)'{-w,--where}'[a list of <fieldName>=<value> pairs to search for]' \
'(-t|--usetoolingapi)'{-t,--usetoolingapi}'[delete the record with Tooling API]' \
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:data:record:get)
_command_args=(
'(-s|--sobjecttype)'{-s,--sobjecttype}'[the type of the record youre retrieving]' \
'(-i|--sobjectid)'{-i,--sobjectid}'[the ID of the record youre retrieving]' \
'(-w|--where)'{-w,--where}'[a list of <fieldName>=<value> pairs to search for]' \
'(-t|--usetoolingapi)'{-t,--usetoolingapi}'[retrieve the record with Tooling API]' \
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:data:record:update)
_command_args=(
'(-s|--sobjecttype)'{-s,--sobjecttype}'[the type of the record youre updating]' \
'(-i|--sobjectid)'{-i,--sobjectid}'[the ID of the record youre updating]' \
'(-w|--where)'{-w,--where}'[a list of <fieldName>=<value> pairs to search for]' \
'(-v|--values)'{-v,--values}'[the <fieldName>=<value> pairs youre updating]' \
'(-t|--usetoolingapi)'{-t,--usetoolingapi}'[update the record with Tooling API]' \
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:mdapi:retrieve)
_command_args=(
'(-a|--apiversion)'{-a,--apiversion}'[target API version for the retrieve (default 41.0)]' \
'(-w|--wait)'{-w,--wait}'[wait time for command to finish in minutes (default: -1 (no limit))]' \
'(-r|--retrievetargetdir)'{-r,--retrievetargetdir}'[directory root for the retrieved files]:file:_files' \
'(-k|--unpackaged)'{-k,--unpackaged}'[file path of manifest of components to retrieve]:file:_files' \
'(-d|--sourcedir)'{-d,--sourcedir}'[source dir to use instead of default manifest sfdx-project.xml]' \
'(-p|--packagenames)'{-p,--packagenames}'[a comma-separated list of packages to retrieve]' \
'(-s|--singlepackage)'{-s,--singlepackage}'[a single-package retrieve (default: false)]' \
'(-i|--jobid)'{-i,--jobid}'[WARNING: The flag "jobid" has been deprecated and will be removed in v41.01.0 or later. Instead, use "sfdx force:mdapi:retrieve:report -i <jobId> -r <targetDir>".]' \
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
'(--verbose)--verbose[verbose output of retrieve result]' \
)
;;
force:mdapi:retrieve:report)
_command_args=(
'(-w|--wait)'{-w,--wait}'[wait time for command to finish in minutes (default: -1 (no limit))]' \
'(-r|--retrievetargetdir)'{-r,--retrievetargetdir}'[directory root for the retrieved files]:file:_files' \
'(-i|--jobid)'{-i,--jobid}'[job ID of the retrieve you want to check]' \
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
'(--verbose)--verbose[verbose output of retrieve result]' \
)
;;
force:alias:set)
_command_args=(
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:config:set)
_command_args=(
'(-g|--global)'{-g,--global}'[set config var globally (to be used from any directory)]:file:_files' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:auth:sfdxurl:store)
_command_args=(
'(-f|--sfdxurlfile)'{-f,--sfdxurlfile}'[path to a file containing the sfdx url]:file:_files' \
'(-d|--setdefaultdevhubusername)'{-d,--setdefaultdevhubusername}'[set the authenticated org as the default dev hub org for scratch org creation]' \
'(-s|--setdefaultusername)'{-s,--setdefaultusername}'[set the authenticated org as the default username that all commands run against]' \
'(-a|--setalias)'{-a,--setalias}'[set an alias for the authenticated org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:org:shape:create)
_command_args=(
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:org:shape:delete)
_command_args=(
'(-p|--noprompt)'{-p,--noprompt}'[do not prompt for confirmation]' \
'(-u|--targetusername)'{-u,--targetusername}'[username for the target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:org:shape:list)
_command_args=(
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
'(--verbose)--verbose[list more information about each org shape]' \
)
;;
force:schema:sobject:describe)
_command_args=(
'(-s|--sobjecttype)'{-s,--sobjecttype}'[the API name of the object to describe]' \
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:schema:sobject:list)
_command_args=(
'(-c|--sobjecttypecategory)'{-c,--sobjecttypecategory}'[the type of objects to list (all|custom|standard)]' \
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:data:soql:query)
_command_args=(
'(-q|--query)'{-q,--query}'[SOQL query to execute]' \
'(-t|--usetoolingapi)'{-t,--usetoolingapi}'[execute query with Tooling API]' \
'(-r|--resultformat)'{-r,--resultformat}'[query result format emitted to stdout; --json flag overrides this parameter (human*,csv,json)]' \
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:source:status)
_command_args=(
'(-a|--all)'{-a,--all}'[list all the changes that have been made]' \
'(-l|--local)'{-l,--local}'[list the changes that have been made locally]' \
'(-r|--remote)'{-r,--remote}'[list the changes that have been made in the scratch org]' \
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:lightning:test:create)
_command_args=(
'(-n|--testname)'{-n,--testname}'[name of the generated Lightning test]' \
'(-t|--template)'{-t,--template}'[template to use for file creation (DefaultLightningTest*)]' \
'(-d|--outputdir)'{-d,--outputdir}'[folder for saving the created files]' \
'(-r|--reflect)'{-r,--reflect}'[switch to return flag detailed information]' \
'(--json)--json[JSON output]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:lightning:test:install)
_command_args=(
'(-w|--wait)'{-w,--wait}'[number of minutes to wait for installation status (default:2)]' \
'(-r|--releaseversion)'{-r,--releaseversion}'[release version of Lightning Testing Service (default:latest)]' \
        '(-t|--packagetype)'{-t,--packagetype}'[type of unmanaged package. "full" option contains both jasmine and mocha, plus examples (full*,jasmine,mocha)]' \
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:apex:test:report)
_command_args=(
'(-i|--testrunid)'{-i,--testrunid}'[ID of test run]' \
'(-c|--codecoverage)'{-c,--codecoverage}'[retrieve code coverage results]' \
'(-d|--outputdir)'{-d,--outputdir}'[directory to store test run files]:file:_files' \
'(-r|--resultformat)'{-r,--resultformat}'[test result format emitted to stdout; --json flag overrides this parameter (human*,tap,junit,json)]' \
'(-w|--wait)'{-w,--wait}'[the streaming client socket timeout (in minutes) (default:6, min:2)]' \
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
'(--verbose)--verbose[display Apex test processing details]' \
)
;;
force:apex:test:run)
_command_args=(
'(-n|--classnames)'{-n,--classnames}'[comma-separated list of Apex test class names to execute]' \
'(-s|--suitenames)'{-s,--suitenames}'[comma-separated list of Apex test suite names to execute]' \
'(-c|--codecoverage)'{-c,--codecoverage}'[retrieve code coverage results]' \
'(-d|--outputdir)'{-d,--outputdir}'[directory to store test run files]:file:_files' \
'(-l|--testlevel)'{-l,--testlevel}'[testlevel enum value (RunLocalTests,RunAllTestsInOrg,RunSpecifiedTests)]' \
'(-r|--resultformat)'{-r,--resultformat}'[test result format emitted to stdout; --json flag overrides this parameter (human*,tap,junit,json)]' \
'(-w|--wait)'{-w,--wait}'[the streaming client socket timeout (in minutes) (default:6, min:2)]' \
'(--precompilewait)--precompilewait[how long to wait (in minutes) for Apex pre-compilation (default:3, min:3)]' \
'(-y|--synchronous)'{-y,--synchronous}'[run tests from a single class synchronously]' \
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
'(--verbose)--verbose[display Apex test processing details]' \
)
;;
force:lightning:test:run)
_command_args=(
'(-a|--appname)'{-a,--appname}'[name of your Lightning test application]' \
'(-d|--outputdir)'{-d,--outputdir}'[directory path to store test run artifacts: for example, log files and test results]:file:_files' \
'(-r|--resultformat)'{-r,--resultformat}'[test result format emitted to stdout; --json flag overrides this parameter (human*,tap,junit,json)]' \
'(-f|--configfile)'{-f,--configfile}'[path to config file for the test]:file:_files' \
'(-o|--leavebrowseropen)'{-o,--leavebrowseropen}'[leave browser open]' \
'(-t|--timeout)'{-t,--timeout}'[time (ms) to wait for results element in dom (default:60000)]' \
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:data:tree:export)
_command_args=(
'(-q|--query)'{-q,--query}'[soql query, or filepath of file containing a soql query, to retrieve records]:file:_files' \
        '(-p|--plan)'{-p,--plan}'[generate multiple sobject tree files and a plan definition file for aggregated import]' \
'(-x|--prefix)'{-x,--prefix}'[prefix of generated files]' \
'(-d|--outputdir)'{-d,--outputdir}'[directory to store files]:file:_files' \
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:data:tree:import)
_command_args=(
'(-f|--sobjecttreefiles)'{-f,--sobjecttreefiles}'[comma-delimited, ordered paths of json files containing collection of record trees to insert]:file:_files' \
'(-p|--plan)'{-p,--plan}'[path to plan to insert multiple data files that have master-detail relationships]:file:_files' \
'(-c|--contenttype)'{-c,--contenttype}'[if data file extension is not .json, provide content type (applies to all files)]' \
'(--confighelp)--confighelp[display schema information for the --plan configuration file to stdout; if you use this option, all other options except --json are ignored]' \
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:apex:trigger:create)
_command_args=(
'(-n|--triggername)'{-n,--triggername}'[name of the generated Apex trigger]' \
'(-t|--template)'{-t,--template}'[template to use for file creation (ApexTrigger*)]' \
'(-d|--outputdir)'{-d,--outputdir}'[folder for saving the created files]' \
'(-r|--reflect)'{-r,--reflect}'[switch to return flag detailed information]' \
'(-a|--apiversion)'{-a,--apiversion}'[API version number (41.0*,40.0)]' \
'(-s|--sobject)'{-s,--sobject}'[sObject to create a trigger on (SOBJECT*)]' \
'(-e|--triggerevents)'{-e,--triggerevents}'[events that fire the trigger (before insert*,before upsert,before delete,after insert,after upsert,after delete,after undelete)]' \
'(--json)--json[JSON output]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:package:uninstall)
_command_args=(
'(-i|--id)'{-i,--id}'[ID of the package to uninstall (starts with 04t)]' \
'(-w|--wait)'{-w,--wait}'[number of minutes to wait for uninstall status]' \
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:package:uninstall:get)
_command_args=(
'(-i|--requestid)'{-i,--requestid}'[ID of the package uninstall request you want to check]' \
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:package2:update)
_command_args=(
'(-i|--package2id)'{-i,--package2id}'[id of the package (starts with 0Ho)]' \
'(-n|--name)'{-n,--name}'[package name]' \
'(-d|--description)'{-d,--description}'[package description]' \
'(-v|--targetdevhubusername)'{-v,--targetdevhubusername}'[username or alias for the dev hub org; overrides default dev hub org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:project:upgrade)
_command_args=(
'(-f|--forceupgrade)'{-f,--forceupgrade}'[run all upgrades even if project has already been upgraded]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:package1:version:create)
_command_args=(
'(-i|--packageid)'{-i,--packageid}'[ID of the metadata package (starts with 033) of which youre creating a new version]' \
'(-n|--name)'{-n,--name}'[package version name]' \
'(-d|--description)'{-d,--description}'[package version description]' \
'(-v|--version)'{-v,--version}'[package version in major.minor format, for example, 3.2]' \
'(-m|--managedreleased)'{-m,--managedreleased}'[create a managed package version]' \
'(-r|--releasenotesurl)'{-r,--releasenotesurl}'[release notes URL]' \
'(-p|--postinstallurl)'{-p,--postinstallurl}'[post install URL]' \
'(-k|--installationkey)'{-k,--installationkey}'[installation key for key-protected package (default: null)]' \
'(-w|--wait)'{-w,--wait}'[minutes to wait for the package version to be created (default: 2 minutes)]' \
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:package2:version:create)
_command_args=(
'(-i|--package2id)'{-i,--package2id}'[ID of the parent package (starts with 0Ho)]' \
'(-d|--directory)'{-d,--directory}'[path to directory that contains the contents of the package version]:file:_files' \
'(-b|--branch)'{-b,--branch}'[the package versions branch]' \
'(-t|--tag)'{-t,--tag}'[the package versions tag]' \
'(-k|--installationkey)'{-k,--installationkey}'[installation key for key-protected package (default: null)]' \
'(-p|--preserve)'{-p,--preserve}'[temp files are preserved that would otherwise be deleted]' \
'(-j|--validateschema)'{-j,--validateschema}'[sfdx-project.json is validated against JSON schema]' \
'(-w|--wait)'{-w,--wait}'[minutes to wait for the package version to be created (default:0)]' \
'(-v|--targetdevhubusername)'{-v,--targetdevhubusername}'[username or alias for the dev hub org; overrides default dev hub org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:package1:version:create:get)
_command_args=(
'(-i|--requestid)'{-i,--requestid}'[PackageUploadRequest ID]' \
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:package2:version:create:get)
_command_args=(
'(-i|--package2createrequestid)'{-i,--package2createrequestid}'[package2 version creation request ID (starts with 08c)]' \
'(-v|--targetdevhubusername)'{-v,--targetdevhubusername}'[username or alias for the dev hub org; overrides default dev hub org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:package2:version:create:list)
_command_args=(
'(-c|--createdlastdays)'{-c,--createdlastdays}'[created in the last specified number of days (starting at 00:00:00 of first day to now; 0 for today)]' \
'(-s|--status)'{-s,--status}'[filter the list by version creation request status (Queued,InProgress,Success,Error)]' \
'(-v|--targetdevhubusername)'{-v,--targetdevhubusername}'[username or alias for the dev hub org; overrides default dev hub org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:package1:version:display)
_command_args=(
'(-i|--packageversionid)'{-i,--packageversionid}'[metadata package version ID (starts with 04t)]' \
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:package2:version:get)
_command_args=(
        '(-i|--package2versionid)'{-i,--package2versionid}'[the package version ID (starts with 05i)]' \
'(-v|--targetdevhubusername)'{-v,--targetdevhubusername}'[username or alias for the dev hub org; overrides default dev hub org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:package1:version:list)
_command_args=(
'(-i|--packageid)'{-i,--packageid}'[metadata package ID (starts with 033)]' \
'(-u|--targetusername)'{-u,--targetusername}'[username or alias for the target org; overrides default target org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:package2:version:list)
_command_args=(
'(-c|--createdlastdays)'{-c,--createdlastdays}'[created in the last specified number of days (starting at 00:00:00 of first day to now; 0 for today)]' \
'(-m|--modifiedlastdays)'{-m,--modifiedlastdays}'[list items modified in the specified last number of days (starting at 00:00:00 of first day to now; 0 for today)]' \
'(-i|--package2ids)'{-i,--package2ids}'[filter results on specified comma-delimited package2 ids (start with 0Ho)]' \
'(-r|--released)'{-r,--released}'[display released versions only]' \
'(-o|--orderby)'{-o,--orderby}'[order by the specified package2 version fields]' \
'(-v|--targetdevhubusername)'{-v,--targetdevhubusername}'[username or alias for the dev hub org; overrides default dev hub org]' \
'(--concise)--concise[display limited package2 version details]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
'(--verbose)--verbose[display extended package2 versions detail]' \
)
;;
force:package2:version:update)
_command_args=(
        '(-i|--package2versionid)'{-i,--package2versionid}'[the package version ID (starts with 05i)]' \
'(-n|--name)'{-n,--name}'[the package version name]' \
'(-d|--description)'{-d,--description}'[the package version description]' \
'(-b|--branch)'{-b,--branch}'[the package version branch]' \
'(-t|--tag)'{-t,--tag}'[the package version tag]' \
'(-k|--installationkey)'{-k,--installationkey}'[installation key for key-protected package (default: null)]' \
'(-s|--setasreleased)'{-s,--setasreleased}'[set the package version as released (cant be undone)]' \
'(-v|--targetdevhubusername)'{-v,--targetdevhubusername}'[username or alias for the dev hub org; overrides default dev hub org]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
force:auth:web:login)
_command_args=(
'(-i|--clientid)'{-i,--clientid}'[OAuth client ID (sometimes called the consumer key)]' \
'(-r|--instanceurl)'{-r,--instanceurl}'[the login URL of the instance the org lives on]' \
'(-d|--setdefaultdevhubusername)'{-d,--setdefaultdevhubusername}'[set the authenticated org as the default dev hub org for scratch org creation]' \
'(-s|--setdefaultusername)'{-s,--setdefaultusername}'[set the authenticated org as the default username that all commands run against]' \
'(-a|--setalias)'{-a,--setalias}'[set an alias for the authenticated org]' \
'(--disablemasking)--disablemasking[disable masking of user input (for use with problematic terminals)]' \
'(--json)--json[format output as json]' \
'(--loglevel)--loglevel[logging level for this command invocation (error*,trace,debug,info,warn,fatal)]' \
)
;;
esac
_arguments \
$_command_args \
&& return 0
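The dispatch above follows a common pattern for multi-subcommand completions: a `case` on the resolved subcommand fills `_command_args` with per-command flag specs, and a single `_arguments` call at the end consumes whatever was accumulated. The same idea can be sketched in plain shell, independent of the zsh completion machinery (the function name `list_flags_for` and the flag sets below are illustrative only, not part of sfdx or zsh):

```shell
#!/bin/sh
# Illustrative sketch of the dispatch pattern used above: select a flag
# list per subcommand, then hand the result to a single consumer.
# The function name and flag sets here are hypothetical examples.
list_flags_for() {
  case "$1" in
    force:alias:list|force:config:list)
      # read-only listing commands share only the two global flags
      echo "--json --loglevel"
      ;;
    force:org:open)
      echo "--path --urlonly --targetusername --json --loglevel"
      ;;
    *)
      # fall back to the global flags for unknown subcommands
      echo "--json --loglevel"
      ;;
  esac
}

list_flags_for force:org:open
```

In the real completion function, `_arguments $_command_args` plays the role of the consumer: it receives only the specs relevant to the matched subcommand, so each `case` branch stays small and independent.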


@@ -1,34 +0,0 @@
#compdef ssh-copy-id
# ------------------------------------------------------------------------------
# Description
# -----------
#
# Completion script for ssh-copy-id.
#
# ------------------------------------------------------------------------------
# Authors
# -------
#
# * Julien Nicoulaud <julien.nicoulaud@gmail.com>
#
# ------------------------------------------------------------------------------
# FIXME This completes "user@host" and not "[user@]host" ("user@" is optional),
# should be merged in _ssh.
_arguments -A "-*" \
'-i+[use identity file]:SSH identity file:_files' \
'-n+[do a dry-run]' \
'-h+[print usage summary]' \
'-p+[specify custom port]' \
'-o+[additional ssh option]' \
'1: :_user_at_host'
# Local Variables:
# mode: Shell-Script
# sh-indentation: 2
# indent-tabs-mode: nil
# sh-basic-offset: 2
# End:
# vim: ft=zsh sw=2 ts=2 et

View File

@@ -40,54 +40,56 @@
_stack () {
  _arguments \
    --version'[display version information]' \
    --help'[display usage information]' \
    '--docker*''[run "stack --docker-help" for details]' \
    --verbosity'[verbosity: silent, error, warn, info, debug]' \
    {-v,--verbose}'[enable verbose mode: verbosity level "debug"]' \
    --system-ghc'[enable using the system installed GHC (on the PATH) if available and a matching version]' \
    --no-system-ghc'[disable using the system installed GHC (on the PATH) if available and a matching version]' \
    --install-ghc'[enable downloading and installing GHC if necessary (can be done manually with stack setup)]' \
    --no-install-ghc'[disable downloading and installing GHC if necessary (can be done manually with stack setup)]' \
    --arch'[system architecture, e.g. i386, x86_64]' \
    --os'[operating system, e.g. linux, windows]' \
    {-j,--jobs}'[number of concurrent jobs to run]' \
    --extra-include-dirs'[extra directories to check for C header files]' \
    --extra-lib-dirs'[extra directories to check for libraries]' \
    --skip-ghc-check'[enable skipping the GHC version and architecture check]' \
    --no-skip-ghc-check'[disable skipping the GHC version and architecture check]' \
    --skip-msys'[enable skipping the local MSYS installation (Windows only)]' \
    --no-skip-msys'[disable skipping the local MSYS installation (Windows only)]' \
    --resolver'[override resolver in project file]' \
    --no-terminal'[override terminal detection in the case of running in a false terminal]' \
    --stack-yaml'[override project stack.yaml file (overrides any STACK_YAML environment variable)]' \
    '*: :__stack_modes'
}

__stack_modes () {
  _values \
    'subcommand' \
    'build[build the project(s) in this directory/configuration]' \
    'install[build executables and install to a user path]' \
    'test[build and test the project(s) in this directory/configuration]' \
    'bench[build and benchmark the project(s) in this directory/configuration]' \
    'haddock[generate haddocks for the project(s) in this directory/configuration]' \
    'new[create a brand new project]' \
    'init[initialize a stack project based on one or more cabal packages]' \
    'solver[use a dependency solver to try and determine missing extra-deps]' \
    'setup[get the appropriate ghc for your project]' \
    'path[print out handy path information]' \
    'unpack[unpack one or more packages locally]' \
    'update[update the package index]' \
'upgrade[Upgrade to the latest stack (experimental)]' \ 'upgrade[upgrade to the latest stack (experimental)]' \
'upload[Upload a package to Hackage]' \ 'upload[upload a package to Hackage]' \
'dot[Visualize your projects dependency graph using Graphviz dot]' \ 'dot[visualize your projects dependency graph using Graphviz dot]' \
'exec[Execute a command]' \ 'exec[execute a command]' \
'ghc[Run ghc]' \ 'ghc[run ghc]' \
'ghci[Run ghci in the context of project(s)]' \ 'ghci[run ghci in the context of project(s)]' \
'ide[Run ide-backend-client with the correct arguments]' \ 'ide[run ide-backend-client with the correct arguments]' \
'runghc[Run runghc]' \ 'runghc[run runghc]' \
'clean[Clean the local packages]' \ 'clean[clean the local packages]' \
'docker[Subcommands specific to Docker use]' 'docker[subcommands specific to Docker use]'
} }
_stack "$@"

View File

@@ -39,22 +39,17 @@
# ------------------------------------------------------------------------------
local I="-h --help -v --version"
local -a args
args+=(
"(- *)"{-h,--help}'[display usage information]'
"(- *)"{-v,--version}'[display version information]'
"($I)--project[load the given project]:project"
"($I)--command[run the given command]:command"
"($I -n --new-window)"{-n,--new-window}'[open a new window]'
"($I -a --add)"{-a,--add}'[add folders to the current window]'
"($I -w --wait)"{-w,--wait}'[wait for the files to be closed before returning]'
"($I -b --background)"{-b,--background}"[don't activate the application]"
)
_arguments "$args[@]" '*:file:_files'

View File

@@ -0,0 +1,64 @@
#compdef tarsnap
# ------------------------------------------------------------------------------
# Copyright (c) 2014-2017 Daniel Teunis - https://github.com/danteu
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
# THE SOFTWARE.
# ------------------------------------------------------------------------------
# Description
# -----------
#
# Completion script for tarsnap 1.0.39 (http://tarsnap.com).
#
# ------------------------------------------------------------------------------
# Author
# -------
#
# * Daniel Teunis <daniel@teunis.cc>
#
# ------------------------------------------------------------------------------
_arguments \
'(- 1 *)--help[prints tarsnap help]' \
'(- 1 *)--version[prints tarsnap version number]' \
'(- 1 *)--verify-config[checks configuration files for syntactic errors]' \
'--fsck[performs integrity checks on the stored archives]' \
'--fsck-prune[performs integrity checks and prunes broken data]' \
'--list-archives[prints names of stored archives]' \
'--nuke[deletes all stored archives]' \
'--print-stats[prints archive statistics]' \
'--recover[recovers partial archive]' \
'-c[creates archive]' \
'-d[deletes specified archive]' \
'-r[prints content of specified archive]' \
'-t[lists archive content]' \
'-x[extracts specified archive]' \
'--lowmem[reduces memory usage by not caching small files]' \
'--quiet[silences some warnings]' \
'--keyfile[specifies keyfile]:keyfile:->file' \
'--cachedir[specifies cache directory]:cachedir:->directory' \
'-f[specifies archive name]:archivename:->file'
case "$state" in
file)
_files
;;
directory)
_path_files -/
;;
esac

View File

@@ -0,0 +1,133 @@
#compdef tmuxp
# ------------------------------------------------------------------------------
# Copyright (c) 2017 Github zsh-users - http://github.com/zsh-users
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
# * Redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer.
# * Redistributions in binary form must reproduce the above copyright
# notice, this list of conditions and the following disclaimer in the
# documentation and/or other materials provided with the distribution.
# * Neither the name of the zsh-users nor the
# names of its contributors may be used to endorse or promote products
# derived from this software without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
# ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
# WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
# DISCLAIMED. IN NO EVENT SHALL ZSH-USERS BE LIABLE FOR ANY
# DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
# (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
# LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
# ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
# SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
# ------------------------------------------------------------------------------
# Description
# -----------
#
# Completion script for tmuxp (https://tmuxp.git-pull.com/en/latest/)
#
# ------------------------------------------------------------------------------
# Authors
# -------
#
# * Bez Hermoso <bezalelhermoso@gmail.com>
#
# ------------------------------------------------------------------------------
_tmuxp() {
local curcontext="$curcontext" state line
typeset -A opt_args
_arguments -C \
":command:->command" \
"*::options:->options" \
"--log_level:log level:(DEBUG INFO WARNING ERROR CRITICAL)" \
"--help[display usage information]"
case $state in
(command)
local -a subcommands
subcommands=(
'convert:Convert a tmuxp config between JSON and YAML.'
'freeze:Snapshot a session into a config.'
'import:Import a teamocil/tmuxinator config.'
'load:Load tmuxp workspaces.'
)
_describe -t commands 'commands' subcommands
;;
(options)
case $line[1] in
(load)
__tmuxp_load
;;
(import)
__tmuxp_import
;;
(freeze)
local sessions="$(__tmux_sessions)"
_arguments -C \
"1::session name:compadd $sessions"
;;
(convert)
_arguments -C \
'1:: :_files -g "*.(json|yaml|yml)"'
;;
esac
esac
}
__tmuxp_load() {
local state line
typeset -A opt_args
_arguments -C \
'*:sessions:->sessions' \
'--yes:yes' \
'-d[Load the session without attaching it]' \
'-2[Force tmux to assume the terminal supports 256 colors]' \
'-8[Like -2, but indicates that the terminal supports 88 colors]'
# Can't get the options to be recognized when there are sessions that have
# a dash.
case $state in
(sessions)
local s
_alternative \
'sessions-user:user session:compadd -F line - ~/.tmuxp/*.(json|yml|yaml)(:r:t)' \
'sessions-other:session in current directory:_path_files -g "**/*.(json|yaml|yml)(-.)"'
;;
esac
}
__tmuxp_import() {
local state line
typeset -A opt_args
_arguments -C \
'1::program:(tmuxinator teamocil)' \
'2::project:->project'
case $state in
(project)
if [[ $line[1] == 'tmuxinator' ]]
then
_wanted tmuxinator-projects exp 'tmuxinator projects' compadd $(tmuxinator completions start)
fi
;;
esac
}
__tmux_sessions () {
local tmux_sessions
tmux_sessions=($(_call_program tmux_sessions 'tmux ls -F "#{session_name}"'))
echo $tmux_sessions
}
_tmuxp "$@"

View File

@@ -0,0 +1,61 @@
#compdef tox
# ------------------------------------------------------------------------------
# Description
# -----------
#
# Completion script for tox (https://tox.readthedocs.io).
#
# ------------------------------------------------------------------------------
# Authors
# -------
#
# * Julien Nicoulaud <julien.nicoulaud@gmail.com>
#
# ------------------------------------------------------------------------------
(( $+functions[_tox_envs_list] )) ||
_tox_envs_list() {
local envs; envs=($(_call_program envs $service --listenvs-all))
if [ ${#envs} -gt 0 ]; then
_values -s , 'tox environments' "${envs[@]}"
else
_message 'tox environments (none found)'
fi
}
_arguments \
'(- 1 *)--version[show version and exit]' \
'(- 1 *)'{-h,--help}'[show help options]' \
'(- 1 *)'{--hi,--help-ini}'[show help about ini-names]' \
'*-v[increase verbosity of reporting output]' \
'*-q[progressively silence reporting output]' \
'(- 1 *)--showconfig[show configuration information for all environments]' \
'(- 1 *)'{-l,--listenvs}'[show list of test environments]' \
'(- 1 *)'{-a,--listenvs-all}'[show list of all defined environments]' \
'-c[config file name or directory with "tox.ini" file]:config path:_files -g "*.ini"' \
'-e[work against specified environments]: :_tox_envs_list' \
'--notest[skip invoking test commands]' \
'--sdistonly[only perform the sdist packaging activity]' \
'--parallel--safe-build[ensure two tox builds can run in parallel]' \
'--installpkg[use specified package for installation into venv, instead of creating an sdist]:package path:_files' \
'--develop[install package in the venv using "setup.py develop"]' \
'(-i --index-url)'{-i,--index-url}'[set indexserver url]:index server URL:_urls' \
'--pre[install pre-releases and development versions of dependencies]' \
'(-r --recreate)'{-r,--recreate}'[force recreation of virtual environments]' \
'--result-json[write a json file with detailed information about all commands and results involved]:JSON file path:_files -g "*.json"' \
'--hashseed[set PYTHONHASHSEED to SEED before running commands]:seed' \
'*--force-dep[forces a certain version of one of the dependencies when configuring the virtual environment]:pip requirement' \
'--sitepackages[override sitepackages setting to True in all envs]' \
'--alwayscopy[override alwayscopy setting to True in all envs]' \
'--skip-missing-interpreters[do not fail tests for missing interpreters]: :(config true false)' \
'--workdir[tox working directory]: :_files -/' \
'*: :_guard "^-*" command positional substitution arguments'
# Local Variables:
# mode: Shell-Script
# sh-indentation: 2
# indent-tabs-mode: nil
# sh-basic-offset: 2
# End:
# vim: ft=zsh sw=2 ts=2 et

View File

@@ -39,16 +39,10 @@
# ------------------------------------------------------------------------------ # ------------------------------------------------------------------------------
_arguments \
'--version[display version information]' \
{-h,--help}'[display usage information]' \
'*: :'
# Local Variables: # Local Variables:
# mode: Shell-Script # mode: Shell-Script

View File

@@ -39,21 +39,15 @@
# ------------------------------------------------------------------------------ # ------------------------------------------------------------------------------
_arguments -C \
'--version[display version information]' \
{-h,--help}'[display usage information]' \
{-d,--directory}'[remove empty directories - ignored (for GNU rm compatibility)]' \
{-f,--force}'[ignore nonexistent arguments and never prompt - ignored (for GNU rm compatibility)]' \
{-i,--interactive}'[prompt before every removal - ignored (for GNU rm compatibility)]' \
{-r,-R,--recursive}'[remove directories and their content recursively - ignored (for GNU rm compatibility)]' \
{-v,--verbose}'[explain what is being done]' \
'*: :_files'
# Local Variables: # Local Variables:
# mode: Shell-Script # mode: Shell-Script

View File

@@ -56,7 +56,7 @@ _paths() {
_path_list+=$_path
done
_describe 'path' _path_list
}
_filesystems() {
@@ -68,96 +68,97 @@ _filesystems() {
'xenix' 'xfs' 'xiafs'
)
_describe 'file system type' _fs_types
}
_udisksctl() {
typeset -A opt_args
local curcontext="$curcontext" state line ret=1
_arguments -C \
'1:udisksctl commands:->cmds' \
'*:: :->cmd_args' && ret=0
case $state in
cmds)
local commands; commands=(
'help:show help'
'info:show info about an object'
'dump:show info about all objects'
'status:shows high-level status'
'monitor:monitor changes to objects'
'mount:mount a filesystem'
'unmount:unmount a filesystem'
'unlock:unlock an encrypted device'
'lock:lock an encrypted device'
'loop-setup:set-up a loop device'
'loop-delete:delete a loop device'
'power-off:safely power off a drive'
'smart-simulate:set SMART data for a drive'
)
_describe -t commands 'udisksctl command' commands && ret=0
;;
cmd_args)
case $words[1] in
info)
_arguments \
{-p,--object-path}'[specify object to get information about]:object path:_paths' \
{-b,--block-device}'[specify block device to get information about]:block device:_paths' \
{-d,--drive}'[specify drive to get information about]:drive:_paths' && ret=0
;;
mount)
_arguments \
{-p,--object-path}'[specify object to mount]:object path:_paths' \
{-b,--block-device}'[specify block device to mount]:block device:_paths' \
{-t,--filesystem-type}'[specify filesystem type to use]:fs type:_filesystems' \
{-o,--options}'[mount options]' \
"(--no-user-interaction)--no-user-interaction[don't authenticate the user if needed]" && ret=0
;;
unmount)
_arguments \
{-p,--object-path}'[object to unmount]:object path:_paths' \
{-b,--block-device}'[block device to unmount]:block device:_paths' \
{-f,--force}'[force/lazy unmount]' \
"(--no-user-interaction)--no-user-interaction[don't authenticate the user if needed]" && ret=0
;;
unlock|lock)
_arguments \
{-p,--object-path}'[object to lock/unlock]:object path:_paths' \
{-b,--block-device}'[block device to lock/unlock]:block device:_paths' \
"(--no-user-interaction)--no-user-interaction[don't authenticate the user if needed]" && ret=0
;;
loop-setup)
_arguments \
{-f,--file}'[specify file to set-up a loop device for]:files:_files' \
{-r,--read-only}'[setup read-only device]' \
{-o,--offset}'[start at specified offset into file]:offset (bytes)' \
{-s,--size}'[limit size]:limit (bytes)' \
"(--no-user-interaction)--no-user-interaction[don't authenticate the user if needed]" && ret=0
;;
loop-delete)
_arguments \
{-p,--object-path}'[object for loop device to delete]:object path:_paths' \
{-b,--block-device}'[loop device to delete]:block device:_paths' \
"(--no-user-interaction)--no-user-interaction[don't authenticate the user if needed]" && ret=0
;;
power-off)
_arguments \
{-p,--object-path}'[object path for ATA device]:object path:_paths' \
{-b,--block-device}'[device file for ATA device]:block device:_paths' \
"(--no-user-interaction)--no-user-interaction[don't authenticate the user if needed]" && ret=0
;;
smart-simulate)
_arguments \
{-f,--file}'[file with libatasmart blob]:files:_files' \
{-p,--object-path}'[object to get information about]:object path:_paths' \
{-b,--block-device}'[block device to get information about]:block device:_paths' \
"(--no-user-interaction)--no-user-interaction[don't authenticate the user if needed]" && ret=0
;;
esac
;;
esac
return ret
}
_udisksctl "$@"

View File

@@ -41,7 +41,7 @@
_ufw_logging() {
local params additional second
second=$words[2]
if [ ! -z $second ]; then
@@ -60,8 +60,8 @@ _ufw_logging() {
"full" "full"
) )
_describe -t params 'On/Off' params _describe -t params 'on/off' params
_describe -t additional 'Level' additional _describe -t additional 'level' additional
} }
@@ -85,35 +85,34 @@ _ufw_delete() {
}
_ufw() {
local curcontext="$curcontext" ret=1
local -a state line commands
commands=(
"enable:enable the firewall"
"disable:disable the firewall"
"default:set default policy"
"logging:set logging level"
"allow:add allow rule"
"deny:add deny rule"
"reject:add reject rule"
"limit:add limit rule"
"delete:delete rule"
"insert:insert rule at position"
"route:add route rule"
"reload:reload firewall"
"reset:reset firewall"
"status:show firewall status"
"show:show firewall report"
"version:display version information"
)
_arguments -C -s -S -n \
'(- 1 *)'--version"[display version information]: :->full" \
'(- 1 *)'{-h,--help}'[display usage information]: :->full' \
'(- 1 *)'--dry-run"[don't modify anything, just show the changes]: :->cmds" \
'1:cmd:->cmds' \
'*:: :->args' && ret=0
case "$state" in
(cmds)
@@ -124,19 +123,21 @@ _ufw() {
cmd=$words[1]
case "$cmd" in
(logging)
_ufw_logging && ret=0
;;
(delete)
_ufw_delete && ret=0
;;
(*)
_default && ret=0
;;
esac
;;
(*)
;;
esac
return ret
}
_ufw

View File

@@ -1,278 +0,0 @@
#compdef virsh
# ------------------------------------------------------------------------------
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
# * Redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer.
# * Redistributions in binary form must reproduce the above copyright
# notice, this list of conditions and the following disclaimer in the
# documentation and/or other materials provided with the distribution.
# * Neither the name of the zsh-users nor the
# names of its contributors may be used to endorse or promote products
# derived from this software without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
# ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
# WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
# DISCLAIMED. IN NO EVENT SHALL ZSH-USERS BE LIABLE FOR ANY
# DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
# (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
# LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
# ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
# SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
# ------------------------------------------------------------------------------
# Description
# -----------
#
# Completion script for LibVirt's Shell (virsh) (http://libvirt.org).
#
# ------------------------------------------------------------------------------
# Authors
# -------
#
# * Jan-Philipp Litza <janphilipp@litza.de>
# * Vadim A. Misbakh-Soloviov <mva@mva.name>
#
# ------------------------------------------------------------------------------
local -a args reply
function _virsh-domains() {
local -a out
out=( ${${${${(f)"$(virsh list "$@")"}:#(---| Id)*}## #[0-9-]## ##}%% *} )
_describe -t domains "${${1#--}:-running} domains" out
return $?
}
local -a expand_files
expand_files=(/$'[^\0]##\0'/ ':file:file name:_files')
local -a expand_dirs
expand_dirs=(/$'[^\0]##\0'/ ":dir:dir name:_files -/")
local -a expand_domains
expand_domains=(
/$'[^\0]##\0'/
":domains:domains:{_virsh-domains --all}"
)
local -a expand_domains_running
expand_domains_running=(
/$'[^\0]##\0'/
":domains:running domains:{_virsh-domains}"
)
local -a expand_domains_inactive
expand_domains_inactive=(
/$'[^\0]##\0'/
":domains:inactive domains:{_virsh-domains --inactive}"
)
local -a expand_kilobytes
expand_kilobytes=(
/'\d+'/
":kilobytes:kilobytes:"
)
local -a expand_count
expand_count=(
/'\d+'/
":count:count:"
)
local -a subcmd_help
subcmd_help=(
/$'[^\0]##\0'/
":command:virsh command:"
)
local -a subcmd_list
_regex_words \
list-options 'list options' \
'--all:list all domains, not only active' \
'--inactive:list only inactive domains'
subcmd_list=("$reply[@]")
local -a subcmd_freecell
subcmd_freecell=(
/'\d*'/
":numa-cell:NUMA cell:"
)
local -a subcmd_autostart
_regex_words \
autostart-options 'autostart options' \
'--disable:disables autostart instead of enabling'
subcmd_autostart=("(" "$reply[@]" "|" ")" $expand_domains)
local -a subcmd_create
_regex_words \
create-options 'create options' \
'--console:attach to the console after creation' \
'--paused:pause domain after creation (if supported by driver)'
subcmd_create=("(" "$reply[@]" "|" ")" $expand_files)
local -a subcmd_dumpxml
_regex_words \
dumpxml-options 'dumpxml options' \
'--inactive:use the config for next reboot instead of current' \
'--security-info:include security sensitive information' \
'--update-cpu:update CPU requirements according to host CPU'
subcmd_dumpxml=("(" "$reply[@]" "|" ")" $expand_domains)
local -a subcmd_start
_regex_words \
start-options 'start options' \
'--console:attach to console after start' \
'--paused: pause after start (if supported by driver)'
subcmd_start=("(" "$reply[@]" "|" ")" $expand_domains_inactive)
args=( /$'[^\0]#\0'/ )
_regex_words \
commands "virsh command" \
'help:print help:$subcmd_help' \
'attach-device:attach device from an XML file:$expand_domains $expand_files' \
'attach-disk:attach disk device' \
'attach-interface:attach network interface' \
'autostart:autostart a domain:$subcmd_autostart' \
'capabilities:capabilities:' \
'cd:change the current directory:$expand_dirs' \
'connect:(re)connect to hypervisor' \
'console:connect to the guest console:$expand_domains_running' \
'cpu-baseline:compute baseline CPU:$expand_files' \
'cpu-compare:compare host CPU with a CPU described by an XML file:$expand_files' \
'create:create a domain from an XML file:$subcmd_create' \
'start:start a (previously defined) inactive domain:$subcmd_start' \
'destroy:destroy a domain:$expand_domains_running' \
'detach-device:detach device from an XML file:$expand_domains $expand_files' \
'detach-disk:detach disk device' \
'detach-interface:detach network interface' \
'define:define (but don'"\'"'t start) a domain from an XML file:$expand_files' \
'domid:convert a domain name or UUID to domain id:$expand_domains' \
'domuuid:convert a domain name or id to domain UUID:$expand_domains' \
'dominfo:domain information:$expand_domains' \
'domjobinfo:domain job information:$expand_domains' \
'domjobabort:abort active domain job:$expand_domains' \
'domname:convert a domain id or UUID to domain name:$expand_domains' \
'domstate:domain state:$expand_domains' \
'domblkstat:get device block stats for a domain' \
'domifstat:get network interface stats for a domain' \
'dommemstat:get memory statistics for a domain:$expand_domains_running' \
'domblkinfo:domain block device size information' \
'domxml-from-native:Convert native config to domain XML' \
'domxml-to-native:Convert domain XML to native config' \
'dumpxml:domain information in XML:$subcmd_dumpxml' \
'edit:edit XML configuration for a domain:$expand_domains' \
'find-storage-pool-sources:discover potential storage pool sources' \
'find-storage-pool-sources-as:find potential storage pool sources' \
'freecell:NUMA free memory:$subcmd_freecell' \
'hostname:print the hypervisor hostname:' \
'list:list domains:$subcmd_list' \
'migrate:migrate domain to another host' \
'migrate-setmaxdowntime:set maximum tolerable downtime' \
'net-autostart:autostart a network' \
'net-create:create a network from an XML file:$expand_files' \
'net-define:define (but don'"\'"'t start) a network from an XML file:$expand_files' \
'net-destroy:destroy a network' \
'net-dumpxml:network information in XML' \
'net-edit:edit XML configuration for a network' \
'net-list:list networks' \
'net-name:convert a network UUID to network name' \
'net-start:start a (previously defined) inactive network' \
'net-undefine:undefine an inactive network' \
'net-uuid:convert a network name to network UUID' \
'iface-list:list physical host interfaces' \
'iface-name:convert an interface MAC address to interface name' \
'iface-mac:convert an interface name to interface MAC address' \
'iface-dumpxml:interface information in XML' \
'iface-define:define (but don'"\'"'t start) a physical host interface from an XML file' \
'iface-undefine:undefine a physical host interface (remove it from configuration)' \
'iface-edit:edit XML configuration for a physical host interface' \
'iface-start:start a physical host interface (enable it / "if-up")' \
'iface-destroy:destroy a physical host interface (disable it / "if-down")' \
'managedsave:managed save of a domain state:$expand_domains_running' \
'managedsave-remove:remove managed save of a domain:$expand_domains' \
'nodeinfo:node information:' \
'nodedev-list:enumerate devices on this host' \
'nodedev-dumpxml:node device details in XML' \
'nodedev-dettach:detach node device from its device driver' \
'nodedev-reattach:reattach node device to its device driver' \
'nodedev-reset:reset node device' \
'nodedev-create:create a device defined by an XML file on the node' \
'nodedev-destroy:destroy a device on the node' \
'nwfilter-define:define or update a network filter from an XML file:$expand_files' \
'nwfilter-undefine:undefine a network filter' \
'nwfilter-dumpxml:network filter information in XML' \
'nwfilter-list:list network filters' \
'nwfilter-edit:edit XML configuration for a network filter' \
'pool-autostart:autostart a pool' \
'pool-build:build a pool' \
'pool-create:create a pool from an XML file:$expand_files' \
'pool-create-as:create a pool from a set of args' \
'pool-define:define (but don'"\'"'t start) a pool from an XML file:$expand_files' \
'pool-define-as:define a pool from a set of args' \
'pool-destroy:destroy a pool' \
'pool-delete:delete a pool' \
'pool-dumpxml:pool information in XML' \
'pool-edit:edit XML configuration for a storage pool' \
'pool-info:storage pool information' \
'pool-list:list pools' \
'pool-name:convert a pool UUID to pool name' \
'pool-refresh:refresh a pool' \
'pool-start:start a (previously defined) inactive pool' \
'pool-undefine:undefine an inactive pool' \
'pool-uuid:convert a pool name to pool UUID' \
'secret-define:define or modify a secret from an XML file:$expand_files' \
'secret-dumpxml:secret attributes in XML' \
'secret-set-value:set a secret value' \
'secret-get-value:output a secret value' \
'secret-undefine:undefine a secret' \
'secret-list:list secrets' \
'pwd:print the current directory:' \
'quit:quit this interactive terminal:' \
'exit:quit this interactive terminal:' \
'reboot:reboot a domain:$expand_domains' \
'restore:restore a domain from a saved state in a file:$expand_files' \
'resume:resume a domain:$expand_domains_running' \
'save:save a domain state to a file:$expand_domains $expand_files' \
'schedinfo:show/set scheduler parameters' \
'dump:dump the core of a domain to a file for analysis:$expand_domains $expand_files' \
'shutdown:gracefully shutdown a domain:$expand_domains_running' \
'setmem:change memory allocation:$expand_domains $expand_kilobytes' \
'setmaxmem:change maximum memory limit:$expand_domains $expand_kilobytes' \
'setvcpus:change number of virtual CPUs:$expand_domains $expand_count' \
'suspend:suspend a domain:$expand_domains_running' \
'ttyconsole:tty console:$expand_domains_running' \
'undefine:undefine an inactive domain:$expand_domains_inactive' \
'update-device:update device from an XML file' \
'uri:print the hypervisor canonical URI:' \
'vol-create:create a vol from an XML file' \
'vol-create-from:create a vol, using another volume as input' \
'vol-create-as:create a volume from a set of args' \
'vol-clone:clone a volume' \
'vol-delete:delete a vol' \
'vol-wipe:wipe a vol' \
'vol-dumpxml:vol information in XML' \
'vol-info:storage vol information' \
'vol-list:list vols' \
'vol-pool:returns the storage pool for a given volume key or path' \
'vol-path:returns the volume path for a given volume name or key' \
'vol-name:returns the volume name for a given volume key or path' \
'vol-key:returns the volume key for a given volume name or path' \
'vcpuinfo:domain vcpu information:$expand_domains_running' \
'vcpupin:control domain vcpu affinity' \
'version:show version:' \
'vncdisplay:vnc display:$expand_domains_running' \
'snapshot-create:create a snapshot:$expand_domains $expand_files' \
'snapshot-current:get the current snapshot:$expand_domains' \
'snapshot-delete:delete a domain snapshot' \
'snapshot-dumpxml:dump XML for a domain snapshot' \
'snapshot-list:list snapshots for a domain:$expand_domains' \
'snapshot-revert:revert a domain to a snapshot'
args+=("$reply[@]")
_regex_arguments _virsh "${args[@]}"
_virsh "$@"


@@ -38,73 +38,75 @@
#
# ------------------------------------------------------------------------------

local curcontext="$curcontext" state state_descr line ret=1
typeset -A opt_args

local period="-d --days -h --hours -m --months -w --weeks"

_arguments -C : \
  '--cleartop[remove all top 10 entries]' \
  '--config[specify alternate configuration file]:file:_files' \
  "($period)"{-d,--days}'[show traffic for days]' \
  "($period)"{-h,--hours}'[show traffic for last 24 hours]' \
  "($period)"{-m,--months}'[show traffic for months]' \
  "($period)"{-w,--weeks}'[show traffic for 7 days]' \
  '--dbdir[specify database directory]:directory:_files -/' \
  '(-D --debug)'{-D,--debug}'[show additional debug output]' \
  '--delete[delete database and stop monitoring selected interface]' \
  '--dumpdb[dump database in parseable format]' \
  '(--disable)--enable[enable updates for selected interface]' \
  '(--enable)--disable[disable updates for selected interface]' \
  '(-i --iface)'{-i,--iface}'[specify interface for actions]:interface:->interface' \
  '--iflist[list available interfaces]' \
  '(-l --live)'{-l,--live}'[display current transfer rate]:mode:->live' \
  '--locale[specify locale]:locale:' \
  '--nick[set nickname for alias]:nickname:' \
  '--oneline[traffic summary in one-line, parseable format]' \
  '(-q --query)'{-q,--query}'[force database query mode]' \
  '(-r --reset)'{-r,--reset}'[reset internal counters]' \
  '--rebuildtotal[reset total traffic counters]' \
  '(-ru --rateunit)'{-ru,--rateunit}'[swap configured rate unit]' \
  '--savemerged[write result of database merge]' \
  '(-s --short)'{-s,--short}'[use short output mode]' \
  '--style[modify content and style of output]:style:->style' \
  '--sync[synchronize internal counters]' \
  '--testkernel[test kernel boot time information]' \
  '(-t --top10)'{-t,--top10}'[show all time top 10 traffic days]' \
  '-tr[calculate amount of traffic in given time]:seconds:' \
  '(-u --update)'{-u,--update}'[update databases]' \
  '--xml[show database content in XML format]' \
  '(-)'{-v,--version}'[show current version]' \
  '(-)'{-\?,--help}'[show command summary]' \
  '(-)--longhelp[show complete options list]' \
  && ret=0

case "$state" in
  (live)
    _values 'live mode' \
      '0[packets per second]' \
      '1[traffic counters]' && ret=0
    ;;
  (style)
    _values 'output style' \
      '0[narrow output]' \
      '1[enable bar column]' \
      '2[bar column with average traffic rate]' \
      '3[enable average traffic rate]' \
      '4[disable use of terminal control characters]' && ret=0
    ;;
  (interface)
    local interfaces="$(_call_program interfaces 'vnstat --iflist')"
    interfaces="${interfaces#*:}"
    _values -s + 'interface' ${(z)interfaces} && ret=0
    ;;
esac

return ret

# Local Variables:
# mode: Shell-Script
# sh-indentation: 2


@@ -35,13 +35,13 @@ _vpnc-connect() {
    '--ifname[visible name of the TUN/TAP interface]:name' \
    '--ifmode[mode of TUN/TAP interface]: :_vpnc_tun_tap_modes' \
    '--debug[show verbose debug messages]: :_vpnc_debug_levels' \
    "--no-detach[don't detach from the console after login]" \
    '--pid-file[store the pid of background process in the file]:pid file:_files' \
    '--local-addr[local IP to use for ISAKMP/ESP/...]: :_hosts' \
    '--local-port[local ISAKMP port number to use]: :_vpnc_isakmp_port_numbers' \
    '--udp-port[local UDP port number to use]: :_vpnc_udp_port_numbers' \
    '--dpd-idle[send DPD packet after not receiving anything for X seconds]: :_vpnc_dpd_idle_times' \
    "--non-inter[don't ask anything, exit on missing options]" \
    '--auth-mode[authentication mode]: :_vpnc_authentication_modes' \
    '--ca-file[filename and path to the CA-PEM-File]:CA-PEM file:_files' \
    '--ca-dir[path of the trusted CA-Directory]:CA directory:_files -/' \

@@ -116,7 +116,7 @@ _vpnc_tun_tap_modes() {
(( $+functions[_vpnc_debug_levels] )) ||
_vpnc_debug_levels() {
  local levels; levels=(
    "0:don't print debug information"
    '1:print minimal debug information'
    '2:show statemachine and packet/payload type information'
    '3:dump everything excluding authentication data'


@@ -0,0 +1,25 @@
#compdef wg-quick
# ------------------------------------------------------------------------------
# Description
# -----------
#
# Completion script for wg-quick (a script for easy management of wireguard
# VPN tunnels) (https://www.wireguard.com/)
#
# ------------------------------------------------------------------------------
# Authors
# -------
#
# * Nicolas Lenz <nicolas@eisfunke.com>
#
# ------------------------------------------------------------------------------
# The possible modes
local modes=('up\:"bring a wireguard interface up"'\
'down\:"tear down and remove a wireguard interface"'\
'save\:"save configuration of a running wireguard interface"')
# 1: Complete mode
# 2: Complete interface with all .conf files in /etc/wireguard without the filename extension.
_arguments "1:mode:((${modes}))"\
'2:interface:_path_files -W /etc/wireguard -g "*.conf(^/:r)"'


@@ -92,7 +92,7 @@ _yaourt_action_sync() {
      _arguments -s : \
        "$_yaourt_opts_common[@]" \
        "$_yaourt_opts_sync_modifiers[@]" \
        '*-c[remove old packages from cache]' \
      ;;
    sync_group)
      _arguments -s : \

@@ -213,110 +213,110 @@ _yaourt() {
  # options for passing to _arguments: main pacman commands
  typeset -a _yaourt_opts_commands
  _yaourt_opts_commands=(
    '-Q[query the package database]'
    '-R[remove a package from the system]'
    '-S[synchronize packages]'
    '-U[upgrade a package]'
    '-V[display version and exit]'
    '-h[display usage information]'
    '-B[backup pacman database]'
    '-G[get PKGBUILD]'
    '-C[clean backup files]'
    '--stats[package statistics]'
  )

  # options for passing to _arguments: options common to all commands
  typeset -a _yaourt_opts_common
  _yaourt_opts_common=(
    '-b[alternate database location]:database_location:_files -/'
    '-h[display syntax for the given operation]'
    '-r[alternate installation root]:installation root:_files -/'
    '-v[be more verbose]'
    '--cachedir[specify alternate package cache location]:cache_location:_files -/'
    '--config[specify alternate configuration file]:config file:_files'
    '--debug[display debug messages]'
    '--gpgdir[specify alternate GnuPG home directory]:gpg_directory:_files -/'
    '--logfile[specify alternate log file]:config file:_files'
    "--noconfirm[don't ask for confirmation]"
    "--noprogressbar[don't show a progress bar when downloading files]"
    "--noscriptlet[don't execute the install scriptlet if one exists]"
    '--print[only print the targets instead of performing the operation]'
  )

  # options for passing to _arguments: options for --upgrade commands
  typeset -a _yaourt_opts_pkgfile
  _yaourt_opts_pkgfile=(
    '-d[skip dependency checks]'
    '-f[overwrite conflicting files]'
    '*:package file:_files -g "*.pkg.tar*(.)"'
  )

  # options for passing to _arguments: subactions for --query command
  typeset -a _yaourt_opts_query_actions
  _yaourt_opts_query_actions=(
    '-g[view all members of a package group]:*:package groups:->query_group'
    '-o[query the package that owns a file]:file:_files'
    '-p[package file to query]:*:package file:->query_file'
    '-s[search package names and descriptions]:*:search text:->query_search'
  )

  # options for passing to _arguments: options for --query and subcommands
  typeset -a _yaourt_opts_query_modifiers
  _yaourt_opts_query_modifiers=(
    '-c[list package changelog]'
    '-d[list packages installed as dependencies]'
    '-e[list packages explicitly installed]'
    '-i[view package information]'
    '-ii[view package information including backup files]'
    '-k[check package files]'
    '-l[list package contents]'
    '-m[list installed packages not found in sync db(s)]'
    '-t[list packages not required by any package]'
    '-u[list packages that can be upgraded]'
    '--aur[install packages from aur, even if they are in community, or, with the -u option, update packages installed from aur]'
    '--devel[used with -u updates all cvs/svn/git/hg/bzr packages]'
    '--date[list packages sorted in ascending order (oldest first) by installation date]'
  )

  # options for passing to _arguments: options for --remove command
  typeset -a _yaourt_opts_remove
  _yaourt_opts_remove=(
    '-c[remove all dependent packages]'
    '-d[skip dependency checks]'
    "-k[only remove database entry, don't remove files]"
    '-n[remove protected configuration files]'
    '-s[remove dependencies not required by other packages]'
    '*:installed package:_yaourt_completions_installed_packages'
  )

  # options for passing to _arguments: options for --sync command
  typeset -a _yaourt_opts_sync_actions
  _yaourt_opts_sync_actions=(
    '*-c[remove old packages from cache]:*:clean:->sync_clean'
    '*-cc[remove all packages from cache]:*:clean:->sync_clean'
    '-g[view all members of a package group]:*:package group:->sync_group'
    '-s[search package names and descriptions]:*:search text:->sync_search'
  )

  # options for passing to _arguments: options for --sync command
  typeset -a _yaourt_opts_sync_modifiers
  _yaourt_opts_sync_modifiers=(
    '-d[skip dependency checks]'
    '-f[overwrite conflicting files]'
    '-i[view package information]'
    '-l[list all packages in a repository]'
    '-p[print download URIs for each package to be installed]'
    '-u[upgrade all out-of-date packages]'
    '-w[download packages only]'
    '-y[download fresh package databases]'
    '*--ignore[ignore a package upgrade]:package:_yaourt_completions_all_packages'
    '*--ignoregroup[ignore a group upgrade]:package group:_yaourt_completions_all_groups'
    '--asdeps[install packages as non-explicitly installed]'
    '--asexplicit[install packages as explicitly installed]'
    "--needed[don't reinstall up to date packages]"
    '--devel[used with -u updates all cvs/svn/git/hg/bzr packages]'
  )

  case $words[2] in


@@ -37,14 +37,14 @@
_commands=(
  'access'
  'autoclean:Clean and remove unnecessary files from package dependencies'
  'cache:List or clean every cached package'
  "check:Verify package dependencies against yarn's lock file"
  'config:Manages the yarn configuration files'
  'generate-lock-entry:Generates a lock file entry'
  'global:Install packages globally on your operating system'
  'help:Show information about a command'
  'import:Generate yarn.lock from an existing npm-installed node_modules folder'
  'info:Show information about a package'
  'init:Interactively creates or updates a package.json file'
  'install:Install all the dependencies listed within package.json'

@@ -53,16 +53,16 @@ _commands=(
  'list:List installed packages'
  'login:Store registry username and email'
  'logout:Clear registry username and email'
  'outdated:Check for outdated package dependencies'
  'owner:Manage package owners'
  'pack:Create a compressed gzip archive of package dependencies'
  'publish:Publish a package to the npm registry'
  'run:Run a defined package script'
  'tag:Add, remove, or list tags on a package'
  'team:Maintain team memberships'
  'unlink:Unlink a previously created symlink for a package'
  'version:Update the package version'
  'versions:Display version information of currently installed Yarn, Node.js, and its dependencies'
  'why:Show information about why a package is installed'
)

@@ -76,13 +76,13 @@ _global_commands=(
_yarn_commands_scripts() {
  local -a scripts
  scripts=($(yarn run --json 2>/dev/null | sed -E '/Commands available|possibleCommands/!d;s/.*Commands available from binary scripts: ([^"]+)".*/\1/;s/.*"items":\[([^]]+).*/\1/;s/[" ]//g' | tr , '\n' | sed -e 's/:/\\:/g'))
  _describe 'command or script' _commands -- _global_commands -- scripts
}

_yarn_scripts() {
  local -a scripts
  scripts=($(yarn run --json 2>/dev/null | sed -E '/Commands available|possibleCommands/!d;s/.*Commands available from binary scripts: ([^"]+)".*/\1/;s/.*"items":\[([^]]+).*/\1/;s/[" ]//g' | tr , '\n' | sed -e 's/:/\\:/g'))
  _describe 'script' scripts
}


@@ -38,21 +38,20 @@
#
# ------------------------------------------------------------------------------

local state line curcontext="$curcontext" ret=1

_arguments -C \
  '-?[display usage information]' \
  -conf='[specify configuration file]:file [zcash.conf]:_files' \
  -datadir='[specify data directory]:directory:_directories' \
  -testnet'[use the test network]' \
  -regtest'[enter regression test mode, which uses a special chain in which blocks can be solved instantly. This is intended for regression testing tools and app development.]' \
  -rpcconnect='[send commands to node running on specified ip]:rpcconnect [127.0.0.1]:_hosts' \
  -rpcport='[connect to JSON-RPC on specified port]: :_guard "[[\:digit\:]]#" "port [8232 or testnet\: 18232]"' \
  -rpcwait'[wait for RPC server to start]' \
  -rpcuser='[username for JSON-RPC connections]:rpcuser' \
  -rpcpassword='[password for JSON-RPC connections]:rpcpassword' \
  -rpcclienttimeout='[specify timeout during HTTP requests, or 0 for no timeout]: :_guard "[[\:digit\:]]#" "timeout (seconds) [900]"' \
  ':subcommand:->subcommand' && ret=0

case $state in

@@ -167,13 +166,11 @@ _zcash-cli() {
      'zcsamplejoinsplit'
    )
    _describe -t subcommands 'zcash-cli subcommand' subcommands && ret=0
esac

return ret

# Local Variables:
# mode: Shell-Script


@@ -6,7 +6,7 @@ with examples, so that you can learn how to write more advanced completion funct
give enough information and examples to get you up and running. If you need more details you can look it up for yourself in the give enough information and examples to get you up and running. If you need more details you can look it up for yourself in the
[[http://zsh.sourceforge.net/Doc/Release/Completion-System.html#Completion-System][official documentation]]. [[http://zsh.sourceforge.net/Doc/Release/Completion-System.html#Completion-System][official documentation]].
Please make any scripts that you create publically available for others (e.g. by forking this repo and making a [[id:64bcd501-b0f0-48c7-b8e2-07af708b95ec][pull request]]). Please make any scripts that you create publicly available for others (e.g. by forking this repo and making a [[id:64bcd501-b0f0-48c7-b8e2-07af708b95ec][pull request]]).
Also if you have any more information to add or improvements to make to this tutorial, please do. Also if you have any more information to add or improvements to make to this tutorial, please do.
* Getting started * Getting started
** Telling zsh which function to use for completing a command ** Telling zsh which function to use for completing a command
@@ -78,7 +78,7 @@ Examples of how to use these functions are given in the next section.
*** main utility functions for overall completion *** main utility functions for overall completion
| _alternative | Can be used to generate completion candidates from other utility functions or shell code. | | _alternative | Can be used to generate completion candidates from other utility functions or shell code. |
| _arguments | Used to specify how to complete individual options & arguments for a command with unix style options. | | _arguments | Used to specify how to complete individual options & arguments for a command with unix style options. |
| _describe | Used for creating simple completions consisting of single words with descriptions (but no actions). Easier to use than _arguments | | _describe | Used for creating simple completions consisting of words with descriptions (but no actions). Easier to use than _arguments |
| _gnu_generic | Can be used to complete options for commands that understand the `--help' option. | | _gnu_generic | Can be used to complete options for commands that understand the `--help' option. |
| _regex_arguments | Creates a function for matching commandline arguments with regular expressions, and then performing actions/completions. | | _regex_arguments | Creates a function for matching commandline arguments with regular expressions, and then performing actions/completions. |
*** functions for performing complex completions of single words *** functions for performing complex completions of single words
@@ -86,6 +86,7 @@ Examples of how to use these functions are given in the next section.
| _combination | Used to complete combinations of values, for example pairs of hostnames and usernames. | | _combination | Used to complete combinations of values, for example pairs of hostnames and usernames. |
| _multi_parts | Used for completing multiple parts of words separately where each part is separated by some char, e.g. for completing partial filepaths: /u/i/sy -> /usr/include/sys | | _multi_parts | Used for completing multiple parts of words separately where each part is separated by some char, e.g. for completing partial filepaths: /u/i/sy -> /usr/include/sys |
| _sep_parts | Like _multi_parts but allows different separators at different parts of the completion. | | _sep_parts | Like _multi_parts but allows different separators at different parts of the completion. |
| _sequence | Used as a wrapper around another completion function to complete a delimited list of matches generated by that other function.
*** functions for completing specific types of objects *** functions for completing specific types of objects
| _path_files | Used to complete filepaths. Take several options to control behaviour. | | _path_files | Used to complete filepaths. Take several options to control behaviour. |
| _files | Calls _path_files with all options except -g and -/. These options depend on file-patterns style setting. | | _files | Calls _path_files with all options except -g and -/. These options depend on file-patterns style setting. |
@@ -119,72 +120,76 @@ _regex_arguments or _alternative functions.
** Writing simple completion functions using _describe ** Writing simple completion functions using _describe
The _describe function can be used for simple completions where the order and position of the options/arguments is The _describe function can be used for simple completions where the order and position of the options/arguments is
not important. You just need to create an array parameter to hold the options & their descriptions, and then pass not important. You just need to create an array parameter to hold the options & their descriptions, and then pass
the parameter name as an argument to _describe. The following example creates completion candidates -c and -d, with the parameter name as an argument to _describe. The following example creates completion candidates c and d, with
the descriptions (note this should be put in a file called _cmd in some directory listed in $fpath). the descriptions (note this should be put in a file called _cmd in some directory listed in $fpath).
#+BEGIN_SRC sh #+BEGIN_SRC sh
#compdef cmd #compdef cmd
local -a options local -a subcmds
options=('-c:description for -c opt' '-d:description for -d opt') subcmds=('c:description for c command' 'd:description for d command')
_describe 'values' options _describe 'command' subcmds
#+END_SRC #+END_SRC
You can use several different lists separated by a double hyphen e.g. like this: You can use several different lists separated by a double hyphen as follows, but note that this mixes the matches under a single heading and is not intended to be used with different types of completion candidates:
#+BEGIN_SRC sh #+BEGIN_SRC sh
local -a options arguments local -a subcmds topics
options=('-c:description for -c opt' '-d:description for -d opt') subcmds=('c:description for c command' 'd:description for d command')
arguments=('e:description for e arg' 'f:description for f arg') topics=('e:description for e help topic' 'f:description for f help topic')
_describe 'values' options -- arguments _describe 'command' subcmds -- topics
#+END_SRC #+END_SRC
If two candidates have the same description, _describe collects them together on the same row and ensures that descriptions are neatly aligned in columns.
The _describe function can be used in an ACTION as part of a specification for _alternative, _arguments or _regex_arguments. The _describe function can be used in an ACTION as part of a specification for _alternative, _arguments or _regex_arguments.
In this case you will have to put it in braces with its arguments, e.g. 'TAG:DESCRIPTION:{_describe 'values' options}' In this case you will have to put it in braces with its arguments, e.g. 'TAG:DESCRIPTION:{_describe 'values' options}'
** Writing completion functions using _alternative ** Writing completion functions using _alternative
Like _describe, this function performs simple completions where the order and position of options/arguments is not important. Like _describe, this function performs simple completions where the order and position of options/arguments is not important.
However, unlike _describe, you can execute shell code or call functions to obtain the completion candidates. However, unlike _describe, instead of fixed matches, further functions may be called to generate the completion candidates. Furthermore, _alternative allows different types of completion candidates to be mixed.
As arguments it takes a list of specifications each in the form 'TAG:DESCRIPTION:ACTION' where TAG is a tag name, As arguments it takes a list of specifications each in the form 'TAG:DESCRIPTION:ACTION' where TAG is a special tag that identifies the type of completion matches,
DESCRIPTION is a description, and ACTION is one of the action types listed previously (apart from the ->STRING and =ACTION forms). DESCRIPTION is used as a heading to describe the group of completion candidates collectively, and ACTION is one of the action types listed previously (apart from the ->STRING and =ACTION forms).
For example: For example:
#+BEGIN_SRC sh #+BEGIN_SRC sh
_alternative 'args:custom args:(a b c)' 'files:filenames:_files' _alternative 'arguments:custom arg:(a b c)' 'files:filename:_files'
#+END_SRC #+END_SRC
The first specification adds completion candidates a, b & c, and the second specification calls the _files function The first specification adds completion candidates a, b & c, and the second specification calls the _files function for completing filepaths.
for completing filepaths.
We could split the specifications over several lines with \ and add descriptions to each of the custom args like this: We could split the specifications over several lines with \ and add descriptions to each of the custom args like this:
#+BEGIN_SRC sh #+BEGIN_SRC sh
_alternative 'args:custom args:((a\:"description a" b\:"description b" c\:"description c"))'\ _alternative \
'files:filenames:_files' 'args:custom arg:((a\:"description a" b\:"description b" c\:"description c"))' \
'files:filename:_files'
#+END_SRC #+END_SRC
If we want to call _files with arguments we can put it in braces, like this: If we want to pass arguments to _files they can simply be included, like this:
#+BEGIN_SRC sh #+BEGIN_SRC sh
_alternative 'args:custom args:((a\:"description a" b\:"description b" c\:"description c"))'\ _alternative \
'files:filenames:{_files -/}' 'args:custom arg:((a\:"description a" b\:"description b" c\:"description c"))'\
'files:filename:_files -/'
#+END_SRC #+END_SRC
To use parameter expansion to create our list of completions we must use double quotes to quote the specifications, To use parameter expansion to create our list of completions we must use double quotes to quote the specifications,
e.g: e.g:
#+BEGIN_SRC sh #+BEGIN_SRC sh
_alternative "dirs:user directories:($userdirs)"\ _alternative \
"pids:process IDs:($(ps -A o pid=))" "dirs:user directory:($userdirs)" \
"pids:process ID:($(ps -A o pid=))"
#+END_SRC #+END_SRC
In this case the first specification adds the words stored in the $userdirs variable, and the second specification In this case the first specification adds the words stored in the $userdirs variable, and the second specification
evaluates 'ps -A o pid=' to get a list of pids to use as completion candidates. evaluates 'ps -A o pid=' to get a list of pids to use as completion candidates. In practice, we would make use of the existing _pids function for this.
We can use other utility functions such as _values in the ACTION to perform more complex completions, e.g: We can use other utility functions such as _values in the ACTION to perform more complex completions, e.g:
#+BEGIN_SRC sh #+BEGIN_SRC sh
_alternative "dirs:user directories:($userdirs)"\ _alternative \
'opts:comma separated opts:{_values -s , a b c}' "directories:user directory:($userdirs)" \
'options:comma-separated opt: _values -s , letter a b c'
#+END_SRC #+END_SRC
this will complete the items in $userdirs, aswell as a comma separated list containing a, b &/or c. this will complete the items in $userdirs, as well as a comma separated list containing a, b &/or c. Note the use of the initial space before _values. This is needed because _values doesn't understand standard compadd options for descriptions.
As with _describe, the _alternative function can itself be used in an ACTION as part of a specification for _arguments As with _describe, the _alternative function can itself be used in an ACTION as part of a specification for _arguments
or _regex_arguments. or _regex_arguments.
** Writing completion functions using _arguments ** Writing completion functions using _arguments
With the _arguments function you can create more sophisticated completion functions. With a single call to the _arguments function you can create fairly sophisticated completion functions. It is intended to handle typical commands that take a variety of options along with some normal arguments.
Like the _alternative function, _arguments takes a list of specification strings as arguments. Like the _alternative function, _arguments takes a list of specification strings as arguments.
These specification strings can be for specifying options and any corresponding option arguments (e.g. -f filename), These specification strings specify options and any corresponding option arguments (e.g. -f filename),
or command arguments. or command arguments.
Basic option specifications take the form '-OPT[DESCRIPTION]', e.g. like this: Basic option specifications take the form '-OPT[DESCRIPTION]', e.g. like this:
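The example itself falls outside the diff hunk shown here; a minimal sketch of such a specification (a hypothetical `example` command with made-up options) would be:

```sh
#compdef example
# Each specification is '-OPT[DESCRIPTION]'; the third one additionally
# completes an option argument (a filename) via the _files function.
_arguments \
  '-v[verbose output]' \
  '-q[quiet output]' \
  '-o[write output to file]:output file:_files'
```

As with the earlier examples, this would live in a file named _example in some directory listed in $fpath.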
@@ -227,11 +232,11 @@ In this case paths to music files are completed stepwise descending down directo
and the flags are completed as a comma separated list using the _values function. and the flags are completed as a comma separated list using the _values function.
I have just given you the basics of _arguments specifications here, you can also specify mutually exclusive options, I have just given you the basics of _arguments specifications here, you can also specify mutually exclusive options,
repeated options & arguments, options beginning with + insead of -, etc. For more details see the [[http://zsh.sourceforge.net/Doc/Release/Completion-System.html#Completion-System][official documentation]]. repeated options & arguments, options beginning with + instead of -, etc. For more details see the [[http://zsh.sourceforge.net/Doc/Release/Completion-System.html#Completion-System][official documentation]].
Also have a look at the tutorials mentioned at the end of this document, and the completion functions in the [[https://github.com/vapniks/zsh-completions/tree/master/src][src directory]]. Also have a look at the tutorials mentioned at the end of this document, and the completion functions in the [[https://github.com/vapniks/zsh-completions/tree/master/src][src directory]].
** Writing completion functions using _regex_arguments and _regex_words ** Writing completion functions using _regex_arguments and _regex_words
If you have a complex command line specification with several different possible argument sequences then If you have a complex command line specification with several different possible argument sequences then
the _regex_arguments function may be what you need. the _regex_arguments function may be what you need. It typically works well where you have a series of keywords followed by a variable number of arguments.
_regex_arguments creates a completion function whose name is given by the first argument. _regex_arguments creates a completion function whose name is given by the first argument.
Hence you need to first call _regex_arguments to create the completion function, and then call that function, Hence you need to first call _regex_arguments to create the completion function, and then call that function,
@@ -251,7 +256,7 @@ For example:
_regex_arguments _cmd SEQ1 '|' SEQ2 \( SEQ2a '|' SEQ2b \) _regex_arguments _cmd SEQ1 '|' SEQ2 \( SEQ2a '|' SEQ2b \)
_cmd "$@" _cmd "$@"
#+END_SRC #+END_SRC
this specifies a command line matching either SEQ1, or SEQ2 followed by SEQ2a or SEQ2b. This specifies a command line matching either SEQ1, or SEQ2 followed by SEQ2a or SEQ2b. You are describing the form that arguments to the command take as a regular expression grammar.
Each specification in a sequence must contain a / PATTERN/ part at the start followed by an optional ':TAG:DESCRIPTION:ACTION' Each specification in a sequence must contain a / PATTERN/ part at the start followed by an optional ':TAG:DESCRIPTION:ACTION'
part. part.
@@ -266,7 +271,7 @@ except that it has an extra : at the start, and now all of the possible ACTION f
Here is an example: Here is an example:
#+BEGIN_SRC sh #+BEGIN_SRC sh
_regex_arguments _hello /$'[^\0]##\0'/ \( /$'word1(a|b|c)\0'/ ':word:first word:(word1a word1b word1c)' '|'\ _regex_arguments _cmd /$'[^\0]##\0'/ \( /$'word1(a|b|c)\0'/ ':word:first word:(word1a word1b word1c)' '|'\
/$'word11(a|b|c)\0'/ ':word:first word:(word11a word11b word11c)' \( /$'word2(a|b|c)\0'/ ':word:second word:(word2a word2b word2c)'\ /$'word11(a|b|c)\0'/ ':word:first word:(word11a word11b word11c)' \( /$'word2(a|b|c)\0'/ ':word:second word:(word2a word2b word2c)'\
'|' /$'word22(a|b|c)\0'/ ':word:second word:(word22a word22b word22c)' \) \) '|' /$'word22(a|b|c)\0'/ ':word:second word:(word22a word22b word22c)' \) \)
_cmd "$@" _cmd "$@"
@@ -435,6 +440,6 @@ you can add another empty option (i.e. \:) to the ACTION like this ':TAG:DESCRIP
Note this only applies to utility functions that use ACTIONs in their specification arguments (_arguments, _regex_arguments, etc.) Note this only applies to utility functions that use ACTIONs in their specification arguments (_arguments, _regex_arguments, etc.)
* Other resources * Other resources
[[http://wikimatze.de/writing-zsh-completion-for-padrino.html][Here]] is a nicely formatted short tutorial showing basic usage of the _arguments function, [[https://wikimatze.de/writing-zsh-completion-for-padrino/][Here]] is a nicely formatted short tutorial showing basic usage of the _arguments function,
and [[http://www.linux-mag.com/id/1106/][here]] is a slightly more advanced tutorial using the _arguments function. and [[http://www.linux-mag.com/id/1106/][here]] is a slightly more advanced tutorial using the _arguments function.
[[http://zsh.sourceforge.net/Doc/Release/Completion-System.html#Completion-System][Here]] is the zshcompsys man page. [[http://zsh.sourceforge.net/Doc/Release/Completion-System.html#Completion-System][Here]] is the zshcompsys man page.
View File
@@ -99,7 +99,7 @@ zstyle ':completion:*:history-words' remove-all-dups yes
zstyle ':completion:*:history-words' list false zstyle ':completion:*:history-words' list false
zstyle ':completion:*:history-words' menu yes zstyle ':completion:*:history-words' menu yes
# Environmental Variables # Environment Variables
zstyle ':completion::*:(-command-|export):*' fake-parameters ${${${_comps[(I)-value-*]#*,}%%,*}:#-*-} zstyle ':completion::*:(-command-|export):*' fake-parameters ${${${_comps[(I)-value-*]#*,}%%,*}:#-*-}
# Populate hostname completion. But allow ignoring custom entries from static # Populate hostname completion. But allow ignoring custom entries from static
View File
@@ -1,7 +1,17 @@
Editor Editor
====== ======
Sets key bindings. Sets editor-specific key bindings, options, and variables.
Options
-------
- `BEEP` beep on error in line editor.
Variables
---------
- `WORDCHARS` treat a given set of characters as part of a word.
Settings Settings
-------- --------
View File
@@ -5,9 +5,6 @@ Sets general shell options and defines environment variables.
This module must be loaded first. This module must be loaded first.
Environment Variables
---------------------
Contributors Contributors
------------ ------------
@@ -15,6 +12,39 @@ This module **MUST NOT** rely on any command not built in Zsh.
Non-interactive environment variables should be defined in [`zshenv`][1]. Non-interactive environment variables should be defined in [`zshenv`][1].
Options
-------
### General
- `COMBINING_CHARS` combine zero-length punctuation characters (accents) with
the base character.
- `INTERACTIVE_COMMENTS` enable comments in interactive shell.
- `RC_QUOTES` allow 'Henry''s Garage' instead of 'Henry'\''s Garage'.
- `MAIL_WARNING` don't print a warning message if a mail file has been accessed.
### Jobs
- `LONG_LIST_JOBS` list jobs in the long format by default.
- `AUTO_RESUME` attempt to resume existing job before creating a new process.
- `NOTIFY` report status of background jobs immediately.
- `BG_NICE` don't run all background jobs at a lower priority.
- `HUP` don't kill jobs on shell exit.
- `CHECK_JOBS` don't report on jobs when the shell exits.
Variables
---------
### Termcap
- `LESS_TERMCAP_mb` begins blinking.
- `LESS_TERMCAP_md` begins bold.
- `LESS_TERMCAP_me` ends mode.
- `LESS_TERMCAP_se` ends standout-mode.
- `LESS_TERMCAP_so` begins standout-mode.
- `LESS_TERMCAP_ue` ends underline.
- `LESS_TERMCAP_us` begins underline.
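As a concrete illustration, these variables are typically exported with terminal escape sequences; the color choices below are hypothetical, not necessarily what the module uses:

```sh
# Hypothetical color assignments for the LESS_TERMCAP_* variables;
# less(1) substitutes these for the termcap bold/underline/standout modes.
export LESS_TERMCAP_mb="$(printf '\033[1;31m')" # begin blinking: bright red
export LESS_TERMCAP_md="$(printf '\033[1;34m')" # begin bold: bright blue
export LESS_TERMCAP_me="$(printf '\033[0m')"    # end mode
export LESS_TERMCAP_so="$(printf '\033[7m')"    # begin standout: reverse video
export LESS_TERMCAP_se="$(printf '\033[0m')"    # end standout-mode
export LESS_TERMCAP_us="$(printf '\033[4;32m')" # begin underline: green
export LESS_TERMCAP_ue="$(printf '\033[0m')"    # end underline
```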
Authors Authors
------- -------
View File
@@ -14,7 +14,7 @@
# paste had a regression. Additionally, 5.2 added bracketed-paste-url-magic # paste had a regression. Additionally, 5.2 added bracketed-paste-url-magic
# which is generally better than url-quote-magic so we load that when possible. # which is generally better than url-quote-magic so we load that when possible.
autoload -Uz is-at-least autoload -Uz is-at-least
if [[ ${ZSH_VERSION} != 5.1.1 ]]; then if [[ ${ZSH_VERSION} != 5.1.1 && ${TERM} != "dumb" ]]; then
if is-at-least 5.2; then if is-at-least 5.2; then
autoload -Uz bracketed-paste-url-magic autoload -Uz bracketed-paste-url-magic
zle -N bracketed-paste bracketed-paste-url-magic zle -N bracketed-paste bracketed-paste-url-magic
View File
@@ -212,12 +212,19 @@ function git-info {
# Format stashed. # Format stashed.
zstyle -s ':prezto:module:git:info:stashed' format 'stashed_format' zstyle -s ':prezto:module:git:info:stashed' format 'stashed_format'
if [[ -n "$stashed_format" && -f "$(git-dir)/refs/stash" ]]; then if [[ -n "$stashed_format" ]]; then
commondir=""
if [[ -f "$(git-dir)/commondir" ]]; then
commondir="$(<$(git-dir)/commondir)"
[[ "$commondir" =~ ^/ ]] || commondir="$(git-dir)/$commondir"
fi
if [[ -f "$(git-dir)/refs/stash" || ( -n "$commondir" && -f "$commondir/refs/stash" ) ]]; then
stashed="$(command git stash list 2> /dev/null | wc -l | awk '{print $1}')" stashed="$(command git stash list 2> /dev/null | wc -l | awk '{print $1}')"
if [[ -n "$stashed" ]]; then if [[ -n "$stashed" ]]; then
zformat -f stashed_formatted "$stashed_format" "S:$stashed" zformat -f stashed_formatted "$stashed_format" "S:$stashed"
fi fi
fi fi
fi
# Format action. # Format action.
zstyle -s ':prezto:module:git:info:action' format 'action_format' zstyle -s ':prezto:module:git:info:action' format 'action_format'
View File
@@ -17,6 +17,27 @@ Requirements
* [ZSH](http://zsh.sourceforge.net) 4.3 or newer * [ZSH](http://zsh.sourceforge.net) 4.3 or newer
Install
------------------------------------------------------------------------------
Using the [Homebrew]( https://brew.sh ) package manager:
brew install zsh-history-substring-search
echo 'source /usr/local/share/zsh-history-substring-search/zsh-history-substring-search.zsh' >> ~/.zshrc
Using [Oh-my-zsh](https://github.com/robbyrussell/oh-my-zsh):
1. Clone this repository in oh-my-zsh's plugins directory:
git clone https://github.com/zsh-users/zsh-history-substring-search ${ZSH_CUSTOM:-~/.oh-my-zsh/custom}/plugins/zsh-history-substring-search
2. Activate the plugin in `~/.zshrc`:
plugins=( [plugins...] history-substring-search)
3. Source `~/.zshrc` to take changes into account:
source ~/.zshrc
Usage Usage
------------------------------------------------------------------------------ ------------------------------------------------------------------------------
@@ -35,6 +56,9 @@ Usage
Users typically bind their UP and DOWN arrow keys to this script, thus: Users typically bind their UP and DOWN arrow keys to this script, thus:
* Run `cat -v` in your favorite terminal emulator to observe key codes. * Run `cat -v` in your favorite terminal emulator to observe key codes.
    (**NOTE:** In some cases, `cat -v` shows the wrong key codes. If the
key codes shown by `cat -v` don't work for you, press `<C-v><UP>` and
`<C-v><DOWN>` at your ZSH command line prompt for correct key codes.)
* Press the UP arrow key and observe what is printed in your terminal. * Press the UP arrow key and observe what is printed in your terminal.
* Press the DOWN arrow key and observe what is printed in your terminal. * Press the DOWN arrow key and observe what is printed in your terminal.
* Press the Control and C keys simultaneously to terminate the `cat -v`. * Press the Control and C keys simultaneously to terminate the `cat -v`.
@@ -107,6 +131,11 @@ default values only after having loaded this script into your ZSH session.
Flags" section in the zshexpn(1) man page to learn about the kinds of Flags" section in the zshexpn(1) man page to learn about the kinds of
values you may assign to this variable. values you may assign to this variable.
* `HISTORY_SUBSTRING_SEARCH_FUZZY` is a global variable that defines
how the command history will be searched for your query. If set to a non-empty
value, this script performs a fuzzy search by words, matching them in the
given order, e.g. `ab c` will match `*ab*c*`.
* `HISTORY_SUBSTRING_SEARCH_ENSURE_UNIQUE` is a global variable that defines * `HISTORY_SUBSTRING_SEARCH_ENSURE_UNIQUE` is a global variable that defines
whether all search results returned are _unique_. If set to a non-empty whether all search results returned are _unique_. If set to a non-empty
value, then only unique search results are presented. This behaviour is off value, then only unique search results are presented. This behaviour is off
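The fuzzy behaviour described above can be sketched in plain shell: split the query on whitespace, then join the words with `*` and wrap the result in `*` (the script itself does this with zsh expansion flags; the variable names here are only illustrative):

```sh
# Build the glob pattern that a fuzzy search for "ab c" would use.
query="ab c"
pattern="*"
for word in $query; do     # unquoted on purpose: split on whitespace
  pattern="${pattern}${word}*"
done
echo "$pattern"            # prints *ab*c*
```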
View File
@@ -0,0 +1,2 @@
0=${(%):-%N}
source ${0:A:h}/zsh-history-substring-search.zsh
View File
@@ -7,6 +7,7 @@
# Copyright (c) 2011 Sorin Ionescu # Copyright (c) 2011 Sorin Ionescu
# Copyright (c) 2011 Vincent Guerci # Copyright (c) 2011 Vincent Guerci
# Copyright (c) 2016 Geza Lore # Copyright (c) 2016 Geza Lore
# Copyright (c) 2017 Bengt Brodersen
# All rights reserved. # All rights reserved.
# #
# Redistribution and use in source and binary forms, with or without # Redistribution and use in source and binary forms, with or without
@@ -39,34 +40,30 @@
############################################################################## ##############################################################################
#----------------------------------------------------------------------------- #-----------------------------------------------------------------------------
# declare global variables # declare global configuration variables
#-----------------------------------------------------------------------------
typeset -g BUFFER MATCH MBEGIN MEND CURSOR
typeset -g HISTORY_SUBSTRING_SEARCH_HIGHLIGHT_FOUND
typeset -g HISTORY_SUBSTRING_SEARCH_HIGHLIGHT_NOT_FOUND
typeset -g HISTORY_SUBSTRING_SEARCH_GLOBBING_FLAGS
typeset -g HISTORY_SUBSTRING_SEARCH_ENSURE_UNIQUE
typeset -g _history_substring_search_refresh_display
typeset -g _history_substring_search_query_highlight
typeset -g _history_substring_search_result
typeset -g _history_substring_search_query
typeset -g -A _history_substring_search_raw_matches
typeset -g _history_substring_search_raw_match_index
typeset -g -A _history_substring_search_matches
typeset -g -A _history_substring_search_unique_filter
typeset -g _history_substring_search_match_index
#-----------------------------------------------------------------------------
# configuration variables
#----------------------------------------------------------------------------- #-----------------------------------------------------------------------------
typeset -g HISTORY_SUBSTRING_SEARCH_HIGHLIGHT_FOUND='bg=magenta,fg=white,bold' typeset -g HISTORY_SUBSTRING_SEARCH_HIGHLIGHT_FOUND='bg=magenta,fg=white,bold'
typeset -g HISTORY_SUBSTRING_SEARCH_HIGHLIGHT_NOT_FOUND='bg=red,fg=white,bold' typeset -g HISTORY_SUBSTRING_SEARCH_HIGHLIGHT_NOT_FOUND='bg=red,fg=white,bold'
typeset -g HISTORY_SUBSTRING_SEARCH_GLOBBING_FLAGS='i' typeset -g HISTORY_SUBSTRING_SEARCH_GLOBBING_FLAGS='i'
typeset -g HISTORY_SUBSTRING_SEARCH_ENSURE_UNIQUE='' typeset -g HISTORY_SUBSTRING_SEARCH_ENSURE_UNIQUE=''
typeset -g _history_substring_search_{refresh_display,query_highlight,result,query,match_index,raw_match_index} typeset -g HISTORY_SUBSTRING_SEARCH_FUZZY=''
typeset -ga _history_substring_search{,_raw}_matches
#-----------------------------------------------------------------------------
# declare internal global variables
#-----------------------------------------------------------------------------
typeset -g BUFFER MATCH MBEGIN MEND CURSOR
typeset -g _history_substring_search_refresh_display
typeset -g _history_substring_search_query_highlight
typeset -g _history_substring_search_result
typeset -g _history_substring_search_query
typeset -g -a _history_substring_search_query_parts
typeset -g -a _history_substring_search_raw_matches
typeset -g -i _history_substring_search_raw_match_index
typeset -g -a _history_substring_search_matches
typeset -g -i _history_substring_search_match_index
typeset -g -A _history_substring_search_unique_filter
#----------------------------------------------------------------------------- #-----------------------------------------------------------------------------
# the main ZLE widgets # the main ZLE widgets
@@ -223,6 +220,7 @@ _history-substring-search-begin() {
# speed things up a little. # speed things up a little.
# #
_history_substring_search_query= _history_substring_search_query=
_history_substring_search_query_parts=()
_history_substring_search_raw_matches=() _history_substring_search_raw_matches=()
else else
@@ -233,20 +231,31 @@ _history-substring-search-begin() {
_history_substring_search_query=$BUFFER _history_substring_search_query=$BUFFER
# #
# $BUFFER contains the text that is in the command-line currently. # compose search pattern
# we put an extra "\\" before meta characters such as "\(" and "\)",
# so that they become "\\\(" and "\\\)".
# #
local escaped_query=${BUFFER//(#m)[\][()|\\*?#<>~^]/\\$MATCH} if [[ -n $HISTORY_SUBSTRING_SEARCH_FUZZY ]]; then
#
# `=` split string in arguments
#
_history_substring_search_query_parts=(${=_history_substring_search_query})
else
_history_substring_search_query_parts=(${_history_substring_search_query})
fi
# #
# Find all occurrences of the search query in the history file. # Escape and join query parts with wildcard character '*' as separator
# `(j:CHAR:)` join array to string with CHAR as separator
#
local search_pattern="*${(j:*:)_history_substring_search_query_parts[@]//(#m)[\][()|\\*?#<>~^]/\\$MATCH}*"
#
# Find all occurrences of the search pattern in the history file.
# #
# (k) returns the "keys" (history index numbers) instead of the values # (k) returns the "keys" (history index numbers) instead of the values
# (R) returns values in reverse order, so the index of the youngest # matching history entry is at the head of the list.
# matching history entry is at the head of the list. # matching history entry is at the head of the list.
# #
_history_substring_search_raw_matches=(${(k)history[(R)(#$HISTORY_SUBSTRING_SEARCH_GLOBBING_FLAGS)*${escaped_query}*]}) _history_substring_search_raw_matches=(${(k)history[(R)(#$HISTORY_SUBSTRING_SEARCH_GLOBBING_FLAGS)${search_pattern}]})
fi fi
# #
@@ -265,8 +274,7 @@ _history-substring-search-begin() {
# #
_history_substring_search_raw_match_index=0 _history_substring_search_raw_match_index=0
_history_substring_search_matches=() _history_substring_search_matches=()
unset _history_substring_search_unique_filter _history_substring_search_unique_filter=()
typeset -A -g _history_substring_search_unique_filter
# #
# If $_history_substring_search_match_index is equal to # If $_history_substring_search_match_index is equal to
@@ -309,16 +317,20 @@ _history-substring-search-end() {
_zsh_highlight _zsh_highlight
# highlight the search query inside the command line # highlight the search query inside the command line
if [[ -n $_history_substring_search_query_highlight && -n $_history_substring_search_query ]]; then if [[ -n $_history_substring_search_query_highlight ]]; then
# # highlight first matching query parts
# The following expression yields a variable $MBEGIN, which local highlight_start_index=0
# indicates the begin position + 1 of the first occurrence local highlight_end_index=0
# of _history_substring_search_query in $BUFFER. for query_part in $_history_substring_search_query_parts; do
# local escaped_query_part=${query_part//(#m)[\][()|\\*?#<>~^]/\\$MATCH}
: ${(S)BUFFER##(#m$HISTORY_SUBSTRING_SEARCH_GLOBBING_FLAGS)($_history_substring_search_query##)} # (i) get index of pattern
local begin=$(( MBEGIN - 1 )) local query_part_match_index=${${BUFFER:$highlight_start_index}[(i)(#$HISTORY_SUBSTRING_SEARCH_GLOBBING_FLAGS)${escaped_query_part}]}
local end=$(( begin + $#_history_substring_search_query )) if [[ $query_part_match_index -le ${#BUFFER:$highlight_start_index} ]]; then
region_highlight+=("$begin $end $_history_substring_search_query_highlight") highlight_start_index=$(( $highlight_start_index + $query_part_match_index ))
highlight_end_index=$(( $highlight_start_index + ${#query_part} ))
region_highlight+=("$(($highlight_start_index - 1)) $(($highlight_end_index - 1)) $_history_substring_search_query_highlight")
fi
done
fi fi
# For debugging purposes: # For debugging purposes:
@@ -447,7 +459,7 @@ _history_substring_search_process_raw_matches() {
# #
# Move on to the next raw entry and get its history index. # Move on to the next raw entry and get its history index.
# #
(( _history_substring_search_raw_match_index++ )) _history_substring_search_raw_match_index+=1
local index=${_history_substring_search_raw_matches[$_history_substring_search_raw_match_index]} local index=${_history_substring_search_raw_matches[$_history_substring_search_raw_match_index]}
# #
@@ -627,7 +639,7 @@ _history-substring-search-up-search() {
# 1. Move index to point to the next match. # 1. Move index to point to the next match.
# 2. Update display to indicate search found. # 2. Update display to indicate search found.
# #
(( _history_substring_search_match_index++ )) _history_substring_search_match_index+=1
_history-substring-search-found _history-substring-search-found
else else
@@ -638,7 +650,7 @@ _history-substring-search-up-search() {
# _history_substring_search_matches. # _history_substring_search_matches.
# 2. Update display to indicate search not found. # 2. Update display to indicate search not found.
# #
(( _history_substring_search_match_index++ )) _history_substring_search_match_index+=1
_history-substring-search-not-found _history-substring-search-not-found
fi fi
@@ -707,7 +719,7 @@ _history-substring-search-down-search() {
# 1. Move index to point to the previous match. # 1. Move index to point to the previous match.
# 2. Update display to indicate search found. # 2. Update display to indicate search found.
# #
(( _history_substring_search_match_index-- )) _history_substring_search_match_index+=-1
_history-substring-search-found _history-substring-search-found
else else
@@ -718,7 +730,7 @@ _history-substring-search-down-search() {
# _history_substring_search_matches. # _history_substring_search_matches.
# 2. Update display to indicate search not found. # 2. Update display to indicate search not found.
# #
(( _history_substring_search_match_index-- )) _history_substring_search_match_index+=-1
_history-substring-search-not-found _history-substring-search-not-found
fi fi
View File
@@ -3,13 +3,6 @@ History
Sets [history][1] options and defines history aliases. Sets [history][1] options and defines history aliases.
Variables
---------
- `HISTFILE` stores the path to the history file.
- `HISTSIZE` stores the maximum number of events to save in the internal history.
- `SAVEHIST` stores the maximum number of events to save in the history file.
Options Options
------- -------
@@ -26,6 +19,13 @@ Options
- `HIST_VERIFY` does not execute immediately upon history expansion. - `HIST_VERIFY` does not execute immediately upon history expansion.
- `HIST_BEEP` beeps when accessing non-existent history. - `HIST_BEEP` beeps when accessing non-existent history.
Variables
---------
- `HISTFILE` stores the path to the history file.
- `HISTSIZE` stores the maximum number of events to save in the internal history.
- `SAVEHIST` stores the maximum number of events to save in the history file.
Aliases Aliases
------- -------


@@ -6,14 +6,6 @@
# Sorin Ionescu <sorin.ionescu@gmail.com> # Sorin Ionescu <sorin.ionescu@gmail.com>
# #
#
# Variables
#
HISTFILE="${ZDOTDIR:-$HOME}/.zhistory" # The path to the history file.
HISTSIZE=10000 # The maximum number of events to save in the internal history.
SAVEHIST=10000 # The maximum number of events to save in the history file.
# #
# Options # Options
# #
@@ -31,6 +23,14 @@ setopt HIST_SAVE_NO_DUPS # Do not write a duplicate event to the history
setopt HIST_VERIFY # Do not execute immediately upon history expansion. setopt HIST_VERIFY # Do not execute immediately upon history expansion.
setopt HIST_BEEP # Beep when accessing non-existent history. setopt HIST_BEEP # Beep when accessing non-existent history.
#
# Variables
#
HISTFILE="${ZDOTDIR:-$HOME}/.zhistory" # The path to the history file.
HISTSIZE=10000 # The maximum number of events to save in the internal history.
SAVEHIST=10000 # The maximum number of events to save in the history file.
# #
# Aliases # Aliases
# #


@@ -1,7 +1,16 @@
Homebrew Homebrew
======== ========
Defines Homebrew aliases. Defines Homebrew-specific environment variables and aliases.
Variables
---------
Execute the following to list the environment variables loaded in the shell:
```sh
brew shellenv
```
Aliases Aliases
------- -------
@@ -20,12 +29,9 @@ Aliases
### Homebrew Cask ### Homebrew Cask
- `cask` is aliased to `brew cask`. - `cask` is aliased to `brew cask`.
- `caskc` cleans up old cached downloads.
- `caskC` cleans up all cached downloads.
- `caski` installs a cask. - `caski` installs a cask.
- `caskl` lists installed casks. - `caskl` lists installed casks.
- `casko` lists casks which have an update available. - `casko` lists casks which have an update available.
- `casks` searches for a cask.
- `caskx` uninstalls a cask. - `caskx` uninstalls a cask.
Authors Authors


@@ -10,6 +10,16 @@ if [[ "$OSTYPE" != (darwin|linux)* ]]; then
return 1 return 1
fi fi
#
# Variables
#
# Load standard Homebrew shellenv into the shell session.
# `brew shellenv` is relatively new, guard for legacy Homebrew.
if (( $+commands[brew] )); then
eval "$(brew shellenv 2> /dev/null)"
fi
# #
# Aliases # Aliases
# #
@@ -21,15 +31,26 @@ alias brewi='brew install'
alias brewl='brew list' alias brewl='brew list'
alias brewo='brew outdated' alias brewo='brew outdated'
alias brews='brew search' alias brews='brew search'
alias brewu='brew update && brew upgrade' alias brewu='brew upgrade'
alias brewx='brew remove' alias brewx='brew uninstall'
# Homebrew Cask # Homebrew Cask
alias cask='brew cask' alias cask='brew cask'
alias caskc='brew cask cleanup --outdated' alias caskc='hb_deprecated brew cask cleanup'
alias caskC='brew cask cleanup' alias caskC='hb_deprecated brew cask cleanup'
alias caski='brew cask install' alias caski='brew cask install'
alias caskl='brew cask list' alias caskl='brew cask list'
alias casko='brew cask outdated' alias casko='brew cask outdated'
alias casks='brew cask search' alias casks='hb_deprecated brew cask search'
alias caskx='brew cask uninstall' alias caskx='brew cask uninstall'
function hb_deprecated {
local cmd="${argv[3]}"
local cmd_args=( ${(@)argv:4} )
printf "'brew cask %s' has been deprecated, " "${cmd}"
printf "using 'brew %s' instead\n" "${cmd}"
cmd_args=( ${(@)argv:4} )
command brew "${cmd}" ${(@)cmd_args}
}


@@ -29,6 +29,53 @@ To test if your terminal and font support it, check that all the necessary chara
![Screenshot](https://gist.githubusercontent.com/agnoster/3712874/raw/screenshot.png) ![Screenshot](https://gist.githubusercontent.com/agnoster/3712874/raw/screenshot.png)
## Customize your prompt view
By default the prompt has these segments, in this order: `prompt_status`, `prompt_context`, `prompt_virtualenv`, `prompt_dir`, `prompt_git`, `prompt_end`.
If you want to add segments, change their order, or remove some of them, you can use the array environment variable `AGNOSTER_PROMPT_SEGMENTS`.
Examples:
- Show all segments of the prompt with indices:
```
echo "${(F)AGNOSTER_PROMPT_SEGMENTS[@]}" | cat -n
```
- Add the new segment of the prompt to the beginning:
```
AGNOSTER_PROMPT_SEGMENTS=("prompt_git" "${AGNOSTER_PROMPT_SEGMENTS[@]}")
```
- Add the new segment of the prompt to the end:
```
AGNOSTER_PROMPT_SEGMENTS+="prompt_end"
```
- Insert the new segment of the prompt = `PROMPT_SEGMENT_NAME` on the particular position = `PROMPT_SEGMENT_POSITION`:
```
PROMPT_SEGMENT_POSITION=5 PROMPT_SEGMENT_NAME="prompt_end";\
AGNOSTER_PROMPT_SEGMENTS=("${AGNOSTER_PROMPT_SEGMENTS[@]:0:$PROMPT_SEGMENT_POSITION-1}" "$PROMPT_SEGMENT_NAME" "${AGNOSTER_PROMPT_SEGMENTS[@]:$PROMPT_SEGMENT_POSITION-1}");\
unset PROMPT_SEGMENT_POSITION PROMPT_SEGMENT_NAME
```
- Swap segments 4th and 5th:
```
SWAP_SEGMENTS=(4 5);\
TMP_VAR="$AGNOSTER_PROMPT_SEGMENTS[$SWAP_SEGMENTS[1]]"; AGNOSTER_PROMPT_SEGMENTS[$SWAP_SEGMENTS[1]]="$AGNOSTER_PROMPT_SEGMENTS[$SWAP_SEGMENTS[2]]"; AGNOSTER_PROMPT_SEGMENTS[$SWAP_SEGMENTS[2]]="$TMP_VAR"
unset SWAP_SEGMENTS TMP_VAR
```
- Remove the 5th segment:
```
AGNOSTER_PROMPT_SEGMENTS[5]=
```
A small demo of a dummy custom prompt segment, created with the help of the built-in `prompt_segment()` function from the Agnoster theme:
```
# prompt_segment() - Takes two arguments, background and foreground.
# Both can be omitted, rendering default background/foreground.
customize_agnoster() {
prompt_segment 'red' '' ' ⚙ ⚡⚡⚡ ⚙ '
}
```
![Customization demo](https://github.com/apodkutin/agnoster-zsh-theme/raw/customize-prompt/agnoster_customization.gif)
## Future Work ## Future Work
I don't want to clutter it up too much, but I am toying with the idea of adding RVM (ruby version) and n (node.js version) display. I don't want to clutter it up too much, but I am toying with the idea of adding RVM (ruby version) and n (node.js version) display.


@@ -22,6 +22,17 @@
# jobs are running in this shell will all be displayed automatically when # jobs are running in this shell will all be displayed automatically when
# appropriate. # appropriate.
### Segments of the prompt, default order declaration
typeset -aHg AGNOSTER_PROMPT_SEGMENTS=(
prompt_status
prompt_context
prompt_virtualenv
prompt_dir
prompt_git
prompt_end
)
### Segment drawing ### Segment drawing
# A few utility functions to make it easy and re-usable to draw segmented prompts # A few utility functions to make it easy and re-usable to draw segmented prompts
@@ -135,12 +146,9 @@ prompt_virtualenv() {
prompt_agnoster_main() { prompt_agnoster_main() {
RETVAL=$? RETVAL=$?
CURRENT_BG='NONE' CURRENT_BG='NONE'
prompt_status for prompt_segment in "${AGNOSTER_PROMPT_SEGMENTS[@]}"; do
prompt_context [[ -n $prompt_segment ]] && $prompt_segment
prompt_virtualenv done
prompt_dir
prompt_git
prompt_end
} }
prompt_agnoster_precmd() { prompt_agnoster_precmd() {

Binary image file changed (1.7 MiB).


@@ -60,10 +60,18 @@ Simply stops a worker and all active jobs will be terminated immediately.
Start a new asynchronous job on specified worker, assumes the worker is running. Start a new asynchronous job on specified worker, assumes the worker is running.
#### `async_worker_eval <worker_name> <my_function> [<function_params>]`
Evaluate a command (like async_job) inside the async worker so that the worker environment can be manipulated. For example, issuing a cd command will change the PWD of the worker, which will then be inherited by all future async jobs.
Output will be returned via callback, job name will be [async/eval].
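As a sketch of how this fits together (the worker and callback names here are illustrative; `async_start_worker`, `async_job`, and `async_process_results` are the library functions documented in this README):

```zsh
# Start a worker, change its working directory, then run a job that
# inherits the new PWD. The eval result arrives through the normal
# callback under the job name '[async/eval]'.
async_start_worker my_worker
async_worker_eval my_worker cd /tmp
async_job my_worker 'print $PWD'

my_callback() {
  # $1 job name, $2 return code, $3 stdout
  print -r -- "job=$1 rc=$2 out=$3"
}
async_process_results my_worker my_callback
```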
#### `async_process_results <worker_name> <callback_function>` #### `async_process_results <worker_name> <callback_function>`
Get results from finished jobs and pass them to the callback function. This is the only way to reliably return the job name, return code, output and execution time and with minimal effort. Get results from finished jobs and pass them to the callback function. This is the only way to reliably return the job name, return code, output and execution time and with minimal effort.
If the async process buffer becomes corrupt, the callback will be invoked with the first argument being `[async]` (job name), non-zero return code and fifth argument describing the error (stderr).
The `callback_function` is called with the following parameters: The `callback_function` is called with the following parameters:
* `$1` job name, e.g. the function passed to async_job * `$1` job name, e.g. the function passed to async_job


@@ -3,15 +3,26 @@
# #
# zsh-async # zsh-async
# #
# version: 1.6.2 # version: 1.7.1
# author: Mathias Fredriksson # author: Mathias Fredriksson
# url: https://github.com/mafredri/zsh-async # url: https://github.com/mafredri/zsh-async
# #
typeset -g ASYNC_VERSION=1.6.2 typeset -g ASYNC_VERSION=1.7.1
# Produce debug output from zsh-async when set to 1. # Produce debug output from zsh-async when set to 1.
typeset -g ASYNC_DEBUG=${ASYNC_DEBUG:-0} typeset -g ASYNC_DEBUG=${ASYNC_DEBUG:-0}
# Execute commands that can manipulate the environment inside the async worker. Return output via callback.
_async_eval() {
local ASYNC_JOB_NAME
# Rename job to _async_eval and redirect all eval output to cat running
# in _async_job. Here, stdout and stderr are not separated for
# simplicity, this could be improved in the future.
{
eval "$@"
} &> >(ASYNC_JOB_NAME=[async/eval] _async_job 'cat')
}
# Wrapper for jobs executed by the async worker, gives output in parseable format with execution time # Wrapper for jobs executed by the async worker, gives output in parseable format with execution time
_async_job() { _async_job() {
# Disable xtrace as it would mangle the output. # Disable xtrace as it would mangle the output.
@@ -26,6 +37,7 @@ _async_job() {
# block, after the command block has completed, the stdin for `cat` is # block, after the command block has completed, the stdin for `cat` is
# closed, causing stderr to be appended with a $'\0' at the end to mark the # closed, causing stderr to be appended with a $'\0' at the end to mark the
# end of output from this job. # end of output from this job.
local jobname=${ASYNC_JOB_NAME:-$1}
local stdout stderr ret tok local stdout stderr ret tok
{ {
stdout=$(eval "$@") stdout=$(eval "$@")
@@ -36,7 +48,7 @@ _async_job() {
read -r -k 1 -p tok || exit 1 read -r -k 1 -p tok || exit 1
# Return output (<job_name> <return_code> <stdout> <duration> <stderr>). # Return output (<job_name> <return_code> <stdout> <duration> <stderr>).
print -r -n - ${(q)1} $ret ${(q)stdout} $duration print -r -n - $'\0'${(q)jobname} $ret ${(q)stdout} $duration
} 2> >(stderr=$(cat) && print -r -n - " "${(q)stderr}$'\0') } 2> >(stderr=$(cat) && print -r -n - " "${(q)stderr}$'\0')
# Unlock mutex by inserting a token. # Unlock mutex by inserting a token.
@@ -80,7 +92,7 @@ _async_worker() {
unset $zsh_hook_functions # And hooks with registered functions. unset $zsh_hook_functions # And hooks with registered functions.
unset zsh_hooks zsh_hook_functions # Cleanup. unset zsh_hooks zsh_hook_functions # Cleanup.
child_exit() { close_idle_coproc() {
local -a pids local -a pids
pids=(${${(v)jobstates##*:*:}%\=*}) pids=(${${(v)jobstates##*:*:}%\=*})
@@ -90,6 +102,10 @@ _async_worker() {
coproc : coproc :
coproc_pid=0 coproc_pid=0
fi fi
}
child_exit() {
close_idle_coproc
# On older version of zsh (pre 5.2) we notify the parent through a # On older version of zsh (pre 5.2) we notify the parent through a
# SIGWINCH signal because `zpty` did not return a file descriptor (fd) # SIGWINCH signal because `zpty` did not return a file descriptor (fd)
@@ -132,7 +148,7 @@ _async_worker() {
coproc_pid=0 # Reset pid. coproc_pid=0 # Reset pid.
} }
local request local request do_eval=0
local -a cmd local -a cmd
while :; do while :; do
# Wait for jobs sent by async_job. # Wait for jobs sent by async_job.
@@ -149,6 +165,7 @@ _async_worker() {
case $request in case $request in
_unset_trap) notify_parent=0; continue;; _unset_trap) notify_parent=0; continue;;
_killjobs) killjobs; continue;; _killjobs) killjobs; continue;;
_async_eval*) do_eval=1;;
esac esac
# Parse the request using shell parsing (z) to allow commands # Parse the request using shell parsing (z) to allow commands
@@ -181,19 +198,35 @@ _async_worker() {
print -n -p "t" print -n -p "t"
fi fi
if (( do_eval )); then
shift cmd # Strip _async_eval from cmd.
_async_eval $cmd
else
# Run job in background, completed jobs are printed to stdout. # Run job in background, completed jobs are printed to stdout.
_async_job $cmd & _async_job $cmd &
# Store pid because zsh job manager is extremely unflexible (show jobname as non-unique '$job')... # Store pid because zsh job manager is extremely unflexible (show jobname as non-unique '$job')...
storage[$job]="$!" storage[$job]="$!"
fi
processing=0 # Disable guard. processing=0 # Disable guard.
if (( do_eval )); then
do_eval=0
# When there are no active jobs we can't rely on the CHLD trap to
# manage the coproc lifetime.
close_idle_coproc
fi
done done
} }
# #
# Get results from finnished jobs and pass it to the to callback function. This is the only way to reliably return the # Get results from finished jobs and pass them to the callback function. This is the only way to reliably return the
# job name, return code, output and execution time and with minimal effort. # job name, return code, output and execution time and with minimal effort.
# #
# If the async process buffer becomes corrupt, the callback will be invoked with the first argument being `[async]` (job
# name), non-zero return code and fifth argument describing the error (stderr).
#
# usage: # usage:
# async_process_results <worker_name> <callback_function> # async_process_results <worker_name> <callback_function>
# #
@@ -245,13 +278,15 @@ async_process_results() {
if (( $#items == 5 )); then if (( $#items == 5 )); then
items+=($has_next) items+=($has_next)
$callback "${(@)items}" # Send all parsed items to the callback. $callback "${(@)items}" # Send all parsed items to the callback.
(( num_processed++ ))
elif [[ -z $items ]]; then
# Empty items occur between results due to double-null ($'\0\0')
# caused by commands being both pre and suffixed with null.
else else
# In case of corrupt data, invoke callback with *async* as job # In case of corrupt data, invoke callback with *async* as job
# name, non-zero exit status and an error message on stderr. # name, non-zero exit status and an error message on stderr.
$callback "async" 1 "" 0 "$0:$LINENO: error: bad format, got ${#items} items (${(q)items})" $has_next $callback "[async]" 1 "" 0 "$0:$LINENO: error: bad format, got ${#items} items (${(q)items})" $has_next
fi fi
(( num_processed++ ))
done done
done done
@@ -297,6 +332,30 @@ async_job() {
zpty -w $worker "$cmd"$'\0' zpty -w $worker "$cmd"$'\0'
} }
#
# Evaluate a command (like async_job) inside the async worker so that the worker environment can be manipulated. For example,
# issuing a cd command will change the PWD of the worker which will then be inherited by all future async jobs.
#
# Output will be returned via callback, job name will be [async/eval].
#
# usage:
# async_worker_eval <worker_name> <my_function> [<function_params>]
#
async_worker_eval() {
setopt localoptions noshwordsplit noksharrays noposixidentifiers noposixstrings
local worker=$1; shift
local -a cmd
cmd=("$@")
if (( $#cmd > 1 )); then
cmd=(${(q)cmd}) # Quote special characters in multi argument commands.
fi
# Quote the cmd in case RC_EXPAND_PARAM is set.
zpty -w $worker "_async_eval $cmd"$'\0'
}
# This function traps notification signals and calls all registered callbacks # This function traps notification signals and calls all registered callbacks
_async_notify_trap() { _async_notify_trap() {
setopt localoptions noshwordsplit setopt localoptions noshwordsplit


@@ -7,8 +7,8 @@ test__async_job_print_hi() {
local line local line
local -a out local -a out
line=$(_async_job print hi) line=$(_async_job print hi)
# Remove trailing null, parse, unquote and interpret as array. # Remove leading/trailing null, parse, unquote and interpret as array.
line=$line[1,$#line-1] line=$line[2,$#line-1]
out=("${(@Q)${(z)line}}") out=("${(@Q)${(z)line}}")
coproc exit coproc exit
@@ -396,8 +396,10 @@ test_async_flush_jobs() {
# TODO: Confirm that they no longer exist in the process tree. # TODO: Confirm that they no longer exist in the process tree.
local output local output
output="${(Q)$(ASYNC_DEBUG=1 async_flush_jobs test)}" output="${(Q)$(ASYNC_DEBUG=1 async_flush_jobs test)}"
[[ $output = *'print_four 0 4'* ]] || { # NOTE(mafredri): First 'p' in print_four is lost when null-prefixing
t_error "want discarded output 'print_four 0 4' when ASYNC_DEBUG=1, got ${(Vq-)output}" # _async_job output.
[[ $output = *'rint_four 0 4'* ]] || {
t_error "want discarded output 'rint_four 0 4' when ASYNC_DEBUG=1, got ${(Vq-)output}"
} }
# Check that the killed job did not produce output. # Check that the killed job did not produce output.
@@ -430,6 +432,35 @@ test_async_worker_survives_termination_of_other_worker() {
(( $#result == 6 )) || t_error "wanted a result, got (${(@Vq)result})" (( $#result == 6 )) || t_error "wanted a result, got (${(@Vq)result})"
} }
test_async_worker_update_pwd() {
local -a result
local eval_out
cb() {
if [[ $1 == '[async/eval]' ]]; then
eval_out="$3"
else
result+=("$3")
fi
}
async_start_worker test1
t_defer async_stop_worker test1
async_job test1 'print $PWD'
async_worker_eval test1 'print -n foo; cd ..; print -n bar; print -n -u2 baz'
async_job test1 'print $PWD'
start=$EPOCHREALTIME
while (( EPOCHREALTIME - start < 2.0 && $#result < 3 )); do
async_process_results test1 cb
done
(( $#result == 2 )) || t_error "wanted 2 results, got ${#result}"
[[ $eval_out = foobarbaz ]] || t_error "wanted async_worker_eval to output foobarbaz, got ${(q)eval_out}"
[[ -n $result[2] ]] || t_error "wanted second pwd to be non-empty"
[[ $result[1] != $result[2] ]] || t_error "wanted worker to change pwd, was ${(q)result[1]}, got ${(q)result[2]}"
}
setopt_helper() { setopt_helper() {
setopt localoptions $1 setopt localoptions $1


@@ -1,3 +1,21 @@
## v0.6.6
- The `rbenv` segment is no longer a default segment in the LPROMPT.
- PR #959 - Fixing issue in v0.6.5 where we changed some color codes.
- PR #934 - Add Tests
- PR #884 - test-in-docker: fix with newer ZSH versions
- PR #928 - [Docs] Add etc state description in dir docs
- PR #937 - Use SUDO_COMMAND to check for sudo
- PR #925 - [Bugfix] Resolve #918 Transparent background
- PR #923 - Fix font issue debugging script
- PR #921 - Add missing colors to fix color comparison
- PR #951 - Add fallback icon for missing linux distro icons
- PR #956 - Fix broken link in readme
- Fixed #936 - fallback icons for Linux distros
- Fixed #926 - `etc` state for `dir` segment in docs
- Fixed #852 - `sudo` detection got crazy, there. sorry, everyone.
- Fixed #927 - more default color issues.
## v0.6.5 ## v0.6.5
- Multiple PRs: General fixes to README, improved documentation. - Multiple PRs: General fixes to README, improved documentation.


@@ -70,7 +70,7 @@ variables to your `~/.zshrc`.
| Variable | Default Value | Description | | Variable | Default Value | Description |
|----------|---------------|-------------| |----------|---------------|-------------|
|`POWERLEVEL9K_LEFT_PROMPT_ELEMENTS`|`(context dir rbenv vcs)`|Segment list for left prompt| |`POWERLEVEL9K_LEFT_PROMPT_ELEMENTS`|`(context dir vcs)`|Segment list for left prompt|
|`POWERLEVEL9K_RIGHT_PROMPT_ELEMENTS`|`(status root_indicator background_jobs history time)`|Segment list for right prompt| |`POWERLEVEL9K_RIGHT_PROMPT_ELEMENTS`|`(status root_indicator background_jobs history time)`|Segment list for right prompt|
@@ -78,7 +78,7 @@ The table above shows the default values, so if you wanted to set these
variables manually, you would put the following in variables manually, you would put the following in
your `~/.zshrc`: your `~/.zshrc`:
```zsh ```zsh
POWERLEVEL9K_LEFT_PROMPT_ELEMENTS=(context dir rbenv vcs) POWERLEVEL9K_LEFT_PROMPT_ELEMENTS=(context dir vcs)
POWERLEVEL9K_RIGHT_PROMPT_ELEMENTS=(status root_indicator background_jobs history time) POWERLEVEL9K_RIGHT_PROMPT_ELEMENTS=(status root_indicator background_jobs history time)
``` ```
#### Available Prompt Segments #### Available Prompt Segments
@@ -357,13 +357,24 @@ end of the hostname.
|`POWERLEVEL9K_ALWAYS_SHOW_USER`|false|Always show the username, but conditionalize the hostname.| |`POWERLEVEL9K_ALWAYS_SHOW_USER`|false|Always show the username, but conditionalize the hostname.|
|`POWERLEVEL9K_CONTEXT_TEMPLATE`|%n@%m|Default context prompt (username@machine). Refer to the [ZSH Documentation](http://zsh.sourceforge.net/Doc/Release/Prompt-Expansion.html) for all possible expansions, including deeper host depths.| |`POWERLEVEL9K_CONTEXT_TEMPLATE`|%n@%m|Default context prompt (username@machine). Refer to the [ZSH Documentation](http://zsh.sourceforge.net/Doc/Release/Prompt-Expansion.html) for all possible expansions, including deeper host depths.|
This segment can have different states, which can help you visualize your
current privileges. Read more about styling with states [here](https://github.com/bhilburn/powerlevel9k/wiki/Stylizing-Your-Prompt#special-segment-colors).
| State | Meaning |
|---------------|----------------------------------------------------------|
| `DEFAULT` | You are a normal user |
| `ROOT` | You are the root user |
| `SUDO` | You are using elevated rights |
| `REMOTE_SUDO` | You are SSH'ed into the machine and have elevated rights |
| `REMOTE` | You are SSH'ed into the machine |
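Per-state styling follows the usual `POWERLEVEL9K_<SEGMENT>_<STATE>_FOREGROUND`/`_BACKGROUND` naming convention described in the wiki page linked above; for example (the color choices here are just an illustration):

```zsh
POWERLEVEL9K_CONTEXT_DEFAULT_FOREGROUND='yellow'
POWERLEVEL9K_CONTEXT_ROOT_FOREGROUND='red'
POWERLEVEL9K_CONTEXT_SUDO_FOREGROUND='magenta'
```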
##### date ##### date
The `date` segment shows the current system date. The `date` segment shows the current system date.
| Variable | Default Value | Description | | Variable | Default Value | Description |
|----------|---------------|-------------| |----------|---------------|-------------|
|`POWERLEVEL9K_DATE_FORMAT`|`%D{%d.%m.%y}`|[ZSH time format](http://zsh.sourceforge.net/Doc/Release Prompt-Expansion.html) to use in this segment.| |`POWERLEVEL9K_DATE_FORMAT`|`%D{%d.%m.%y}`|[ZSH time format](http://zsh.sourceforge.net/Doc/Release/Prompt-Expansion.html#Date-and-time) to use in this segment.|
##### dir ##### dir
@@ -375,12 +386,14 @@ Powerline" fonts, there are additional glyphs, as well:
| None | None | ![](https://cloud.githubusercontent.com/assets/1544760/12183451/40ec4016-b58f-11e5-9b9e-74e2b2f0b8b3.png) | At the root of your home folder | | None | None | ![](https://cloud.githubusercontent.com/assets/1544760/12183451/40ec4016-b58f-11e5-9b9e-74e2b2f0b8b3.png) | At the root of your home folder |
| None | None | ![](https://cloud.githubusercontent.com/assets/1544760/12369315/8a5d762c-bbf5-11e5-8a20-ca1179f48d6c.png) | Within a subfolder of your home directory | | None | None | ![](https://cloud.githubusercontent.com/assets/1544760/12369315/8a5d762c-bbf5-11e5-8a20-ca1179f48d6c.png) | Within a subfolder of your home directory |
| None | None | ![](https://cloud.githubusercontent.com/assets/1544760/12183452/40f79286-b58f-11e5-9b8c-ed1343a07b08.png) | Outside of your home folder | | None | None | ![](https://cloud.githubusercontent.com/assets/1544760/12183452/40f79286-b58f-11e5-9b8c-ed1343a07b08.png) | Outside of your home folder |
| None | None | ⚙ | Within the `/etc` directory |
To turn off these icons you could set these variables to an empty string. To turn off these icons you could set these variables to an empty string.
```zsh ```zsh
POWERLEVEL9K_HOME_ICON='' POWERLEVEL9K_HOME_ICON=''
POWERLEVEL9K_HOME_SUB_ICON='' POWERLEVEL9K_HOME_SUB_ICON=''
POWERLEVEL9K_FOLDER_ICON='' POWERLEVEL9K_FOLDER_ICON=''
POWERLEVEL9K_ETC_ICON=''
``` ```
You can limit the output to a certain length by truncating long paths. You can limit the output to a certain length by truncating long paths.
Customizations available are: Customizations available are:
@@ -561,6 +574,19 @@ Variable | Default Value | Description |
|----------|---------------|-------------| |----------|---------------|-------------|
|`POWERLEVEL9K_RBENV_PROMPT_ALWAYS_SHOW`|`false`|Set to true if you wish to show the rbenv segment even if the current Ruby version is the same as the global Ruby version| |`POWERLEVEL9K_RBENV_PROMPT_ALWAYS_SHOW`|`false`|Set to true if you wish to show the rbenv segment even if the current Ruby version is the same as the global Ruby version|
##### pyenv
This segment shows the version of Python being used when using `pyenv` to change your current Python stack.
The `PYENV_VERSION` environment variable will be used if specified. Otherwise it figures out the version being used by taking the output of the `pyenv version-name` command.
* If `pyenv` is not in $PATH, nothing will be shown.
* If the current Python version is the same as the global Python version, nothing will be shown.
| Variable | Default Value | Description |
|----------|---------------|-------------|
|`POWERLEVEL9K_PYENV_PROMPT_ALWAYS_SHOW`|`false`|Set to true if you wish to show the pyenv segment even if the current Python version is the same as the global Python version|
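For instance, to always display the pyenv segment and place it in the left prompt, you might add the following to your `~/.zshrc` (the element list shown is just one possible arrangement):

```zsh
POWERLEVEL9K_PYENV_PROMPT_ALWAYS_SHOW=true
POWERLEVEL9K_LEFT_PROMPT_ELEMENTS=(context dir pyenv vcs)
```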
##### rspec_stats ##### rspec_stats
See [Unit Test Ratios](#unit-test-ratios), below. See [Unit Test Ratios](#unit-test-ratios), below.


@@ -1,25 +1,45 @@
# Structure # Tests
The Unit-Tests do not follow exactly the file structure of Powerlevel9k itself. ## Automated Tests
## Basic Tests The Unit-Tests do not follow exactly the file structure of Powerlevel9k itself,
but we try to reflect the structure as much as possible. All tests are located
under `test/`. Segment-specific tests live under `test/segments/` (one file per
segment).
Basic Tests belong in `test/powerlevel9k.spec` if they test basic functionality of ### Installation
Powerlevel9k itself. Basic functions from the `functions` directory have their
Tests in separate files under `test/functions`.
## Segment Tests In order to execute the tests you need to install `shunit2`, which is a
submodule. To install the submodule, you can execute
`git submodule init && git submodule update`.
These Tests tend to be more complex in setup than the basic tests. To avoid ending ### Executing tests
up in a huge single file, there is one file per segment in `test/segments`.
# Manual Testing The tests are standalone shell scripts, so you can execute them directly.
To execute all tests you could just execute `./test/suite.spec`.
### General Test Structure
The tests usually have a `setUp()` function which is executed before every
test function. Test functions themselves must be prefixed with `test`. In
the tests, you can do [different Assertions](https://github.com/kward/shunit2#-asserts).
It is always a good idea to mock the program you want to test (just have a
look at other tests), so that the test runner does not need to have all
programs installed.
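A minimal segment test following that structure might look like the sketch below (file, function, and mock names are illustrative; shunit2 supplies the assert functions and the runner):

```zsh
#!/usr/bin/env zsh
# test/segments/dummy.spec -- illustrative skeleton only

setUp() {
  # Runs before every test* function.
  export TERM="xterm-256color"
  source powerlevel9k.zsh-theme
}

testDummySegmentRendersSomething() {
  # Mock the external program instead of requiring it to be installed.
  pyenv() { echo "3.7.0"; }
  assertTrue "prompt should build without errors" 'build_left_prompt'
}

source shunit2/shunit2
```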
### Travis
We use [Travis](https://travis-ci.org/) for Continuous Integration. This
service executes our tests after every push. For now, we need to tell travis
where to find the tests, which is what happens in the `.travis.yml` file.
## Manual Testing
If unit tests are not sufficient (e.g. you have an issue with your prompt that If unit tests are not sufficient (e.g. you have an issue with your prompt that
occurs only in a specific ZSH framework) then you can use either Docker or occurs only in a specific ZSH framework) then you can use either Docker or
our Vagrant setup. our Vagrant setup.
## Docker ### Docker
This is the easiest to use _if_ you have Docker already installed and running. This is the easiest to use _if_ you have Docker already installed and running.
@@ -42,7 +62,7 @@ You can get Docker at <https://www.docker.com/community-edition>.
**Note:** Not all frameworks work with all versions of ZSH (or the underlying OS). **Note:** Not all frameworks work with all versions of ZSH (or the underlying OS).
## Vagrant ### Vagrant
Currently there are two test VMs. `test-vm` is an Ubuntu machine with several Currently there are two test VMs. `test-vm` is an Ubuntu machine with several
pre-installed ZSH frameworks. And there is `test-bsd-vm` which is a FreeBSD! pre-installed ZSH frameworks. And there is `test-bsd-vm` which is a FreeBSD!


@@ -1,9 +1,67 @@
#!/usr/bin/env zsh #!/usr/bin/env zsh
#vim:ft=zsh ts=2 sw=2 sts=2 et fenc=utf-8 #vim:ft=zsh ts=2 sw=2 sts=2 et fenc=utf-8
source functions/colors.zsh
source functions/icons.zsh
source functions/utilities.zsh
# Map our $OS to neofetch $os
os="$OS"
trim() {
set -f
# shellcheck disable=2048,2086
set -- $*
printf '%s\n' "${*//[[:space:]]/}"
set +f
}
get_ppid() {
# Get parent process ID of PID.
case "$os" in
"Windows")
ppid="$(ps -p "${1:-$PPID}" | awk '{printf $2}')"
ppid="${ppid/PPID}"
;;
"Linux")
ppid="$(grep -i -F "PPid:" "/proc/${1:-$PPID}/status")"
ppid="$(trim "${ppid/PPid:}")"
;;
*)
ppid="$(ps -p "${1:-$PPID}" -o ppid=)"
;;
esac
printf "%s" "$ppid"
}
get_process_name() {
# Get PID name.
case "$os" in
"Windows")
name="$(ps -p "${1:-$PPID}" | awk '{printf $8}')"
name="${name/COMMAND}"
name="${name/*\/}"
;;
"Linux")
name="$(< "/proc/${1:-$PPID}/comm")"
;;
*)
name="$(ps -p "${1:-$PPID}" -o comm=)"
;;
esac
printf "%s" "$name"
}
# Taken from NeoFetch (slightly modified) # Taken from NeoFetch (slightly modified)
get_term() { get_term() {
local term local term
# If function was run, stop here. # If function was run, stop here.
# ((term_run == 1)) && return # ((term_run == 1)) && return
@@ -16,16 +74,34 @@ get_term() {
*) term="${TERM_PROGRAM/\.app}" ;; *) term="${TERM_PROGRAM/\.app}" ;;
esac esac
# Most likely TosWin2 on FreeMiNT - quick check
[[ "$TERM" == "tw52" || "$TERM" == "tw100" ]] && \
term="TosWin2"
[[ "$SSH_CONNECTION" ]] && \
term="$SSH_TTY"
# Check $PPID for terminal emulator. # Check $PPID for terminal emulator.
while [[ -z "$term" ]]; do while [[ -z "$term" ]]; do
parent="$(get_ppid "$parent")" parent="$(get_ppid "$parent")"
[[ -z "$parent" ]] && break
name="$(get_process_name "$parent")" name="$(get_process_name "$parent")"
case "${name// }" in case "${name// }" in
"${SHELL/*\/}" | *"sh" | "tmux"* | "screen" | "su"*) ;; "${SHELL/*\/}"|*"sh"|"screen"|"su"*) ;;
"login"* | *"Login"* | "init" | "(init)") term="$(tty)" ;;
"ruby" | "1" | "systemd" | "sshd"* | "python"* | "USER"*"PID"*) break ;; "login"*|*"Login"*|"init"|"(init)")
term="$(tty)"
;;
"ruby"|"1"|"tmux"*|"systemd"|"sshd"*|"python"*|"USER"*"PID"*|"kdeinit"*|"launchd"*)
break
;;
"gnome-terminal-") term="gnome-terminal" ;; "gnome-terminal-") term="gnome-terminal" ;;
"urxvtd") term="urxvt" ;;
*"nvim") term="Neovim Terminal" ;;
*"NeoVimServer"*) term="VimR Terminal" ;;
*) term="${name##*/}" ;; *) term="${name##*/}" ;;
esac esac
done done
@@ -42,70 +118,119 @@ get_term_font() {
case "$term" in case "$term" in
"alacritty"*) "alacritty"*)
term_font="$(awk -F ':|#' '/normal:/ {getline; print}' "${XDG_CONFIG_HOME}/alacritty/alacritty.yml")" shopt -s nullglob
confs=({$XDG_CONFIG_HOME,$HOME}/{alacritty,}/{.,}alacritty.ym?)
shopt -u nullglob
[[ -f "${confs[0]}" ]] || return
term_font="$(awk -F ':|#' '/normal:/ {getline; print}' "${confs[0]}")"
term_font="${term_font/*family:}" term_font="${term_font/*family:}"
term_font="${term_font/$'\n'*}" term_font="${term_font/$'\n'*}"
term_font="${term_font/\#*}" term_font="${term_font/\#*}"
;; ;;
"Apple_Terminal") "Apple_Terminal")
term_font="$(osascript -e 'tell application "Terminal" to font name of window frontmost')" term_font="$(osascript <<END
tell application "Terminal" to font name of window frontmost
END
)"
;; ;;
        "iTerm2")
            # Unfortunately the profile name is not unique, but it seems to be the only thing
            # that identifies an active profile. There is the "id of current session of current
            # window" though, but that does not match to a guid in the plist.
            # So, be warned, collisions may occur!
            # See: https://groups.google.com/forum/#!topic/iterm2-discuss/0tO3xZ4Zlwg
            # and: https://gitlab.com/gnachman/iterm2/issues/5586
            local current_profile_name profiles_count profile_name diff_font

            current_profile_name="$(osascript <<END
tell application "iTerm2" to profile name \
of current session of current window
END
)"

            # Warning: Dynamic profiles are not taken into account here!
            # https://www.iterm2.com/documentation-dynamic-profiles.html
            font_file="${HOME}/Library/Preferences/com.googlecode.iterm2.plist"

            # Count Guids in "New Bookmarks"; they should be unique
            profiles_count="$(/usr/libexec/PlistBuddy -c "Print ':New Bookmarks:'" "$font_file" | \
                              grep -w -c "Guid")"

            for ((i=0; i<profiles_count; i++)); do
                profile_name="$(/usr/libexec/PlistBuddy -c "Print ':New Bookmarks:${i}:Name:'" "$font_file")"

                if [[ "$profile_name" == "$current_profile_name" ]]; then
                    # "Normal Font"
                    term_font="$(/usr/libexec/PlistBuddy -c "Print ':New Bookmarks:${i}:Normal Font:'" \
                                 "$font_file")"

                    # Font for non-ascii characters
                    # Only check for a different non-ascii font, if the user checked
                    # the "use a different font for non-ascii text" switch.
                    diff_font="$(/usr/libexec/PlistBuddy -c "Print ':New Bookmarks:${i}:Use Non-ASCII Font:'" \
                                 "$font_file")"

                    if [[ "$diff_font" == "true" ]]; then
                        non_ascii="$(/usr/libexec/PlistBuddy -c "Print ':New Bookmarks:${i}:Non Ascii Font:'" \
                                     "$font_file")"

                        [[ "$term_font" != "$non_ascii" ]] && \
                            term_font="$term_font (normal) / $non_ascii (non-ascii)"
                    fi
                fi
            done
        ;;
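The Guid-counting step above can be illustrated on a PlistBuddy-style dump; the sample data below is invented for demonstration and is not a real iTerm2 plist:

```shell
# Count profile Guids in a PlistBuddy-style dump (hypothetical sample data).
plist_dump='Dict {
    Guid = A1
    Name = Default
}
Dict {
    Guid = B2
    Name = Work
}'

# -w matches "Guid" as a whole word, -c counts matching lines.
count="$(printf '%s\n' "$plist_dump" | grep -w -c "Guid")"
echo "$count"   # → 2
```

Each top-level bookmark entry carries exactly one Guid, so the count doubles as the number of profiles to iterate over.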
        "deepin-terminal"*)
            term_font="$(awk -F '=' '/font=/ {a=$2} /font_size/ {b=$2} END {print a " " b}' \
                         "${XDG_CONFIG_HOME}/deepin/deepin-terminal/config.conf")"
        ;;

        "GNUstep_Terminal")
            term_font="$(awk -F '>|<' '/>TerminalFont</ {getline; f=$3}
                         />TerminalFontSize</ {getline; s=$3} END {print f " " s}' \
                         "${HOME}/GNUstep/Defaults/Terminal.plist")"
        ;;
        "Hyper"*)
            term_font="$(awk -F':|,' '/fontFamily/ {print $2; exit}' "${HOME}/.hyper.js")"
            term_font="$(trim_quotes "$term_font")"
        ;;

        "kitty"*)
            shopt -s nullglob
            confs=({$KITTY_CONFIG_DIRECTORY,$XDG_CONFIG_HOME,~/Library/Preferences}/kitty/kitty.con?)
            shopt -u nullglob

            [[ -f "${confs[0]}" ]] || return

            term_font="$(awk '/^([[:space:]]*|[^#_])font_family[[:space:]]+/ {
                                  $1 = "";
                                  gsub(/^[[:space:]]/, "");
                                  font = $0
                              }
                              /^([[:space:]]*|[^#_])font_size[[:space:]]+/ {
                                  size = $2
                              }
                              END { print font " " size }' "${confs[0]}")"
        ;;
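The kitty `awk` extraction can be exercised against a sample `kitty.conf` snippet (the sample data below is illustrative only):

```shell
# Sample kitty.conf content (invented for demonstration).
conf='# comment line
font_family Fira Code
font_size 12.0'

# Same technique as above: blank out $1 to keep the multi-word family name,
# strip the leading separator, then join family and size at the end.
result="$(printf '%s\n' "$conf" | awk '
    /^font_family[[:space:]]+/ { $1 = ""; gsub(/^[[:space:]]/, ""); font = $0 }
    /^font_size[[:space:]]+/   { size = $2 }
    END { print font " " size }')"
echo "$result"   # → Fira Code 12.0
```

Clearing `$1` and letting awk rebuild `$0` is what preserves font families that contain spaces.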
        "konsole"*)
            # Get Process ID of current konsole window / tab
            child="$(get_ppid "$$")"

            IFS=$'\n' read -d "" -ra konsole_instances < <(qdbus | grep -F 'org.kde.konsole')

            for i in "${konsole_instances[@]}"; do
                IFS=$'\n' read -d "" -ra konsole_sessions < <(qdbus "$i" | grep -F '/Sessions/')

                for session in "${konsole_sessions[@]}"; do
                    if ((child == "$(qdbus "$i" "$session" processId)")); then
                        profile="$(qdbus "$i" "$session" environment |\
                                   awk -F '=' '/KONSOLE_PROFILE_NAME/ {print $2}')"
                        break
                    fi
                done
@@ -113,9 +238,53 @@ get_term_font() {
            done

            # We could have two profile files for the same profile name, take first match
            profile_filename="$(grep -l "Name=${profile}" "$HOME"/.local/share/konsole/*.profile)"
            profile_filename="${profile_filename/$'\n'*}"

            [[ "$profile_filename" ]] && \
                term_font="$(awk -F '=|,' '/Font=/ {print $2 " " $3}' "$profile_filename")"
        ;;
"lxterminal"*)
term_font="$(awk -F '=' '/fontname=/ {print $2; exit}' \
"${XDG_CONFIG_HOME}/lxterminal/lxterminal.conf")"
;;
"mate-terminal")
# To get the actual config we have to create a temporarily file with the
# --save-config option.
mateterm_config="/tmp/mateterm.cfg"
# Ensure /tmp exists and we do not overwrite anything.
if [[ -d /tmp && ! -f "$mateterm_config" ]]; then
mate-terminal --save-config="$mateterm_config"
role="$(xprop -id "${WINDOWID}" WM_WINDOW_ROLE)"
role="${role##* }"
role="${role//\"}"
profile="$(awk -F '=' -v r="$role" \
'$0~r {
getline;
if(/Maximized/) getline;
if(/Fullscreen/) getline;
id=$2"]"
} $0~id {if(id) {getline; print $2; exit}}' \
"$mateterm_config")"
rm -f "$mateterm_config"
mate_get() {
gsettings get org.mate.terminal.profile:/org/mate/terminal/profiles/"$1"/ "$2"
}
if [[ "$(mate_get "$profile" "use-system-font")" == "true" ]]; then
term_font="$(gsettings get org.mate.interface monospace-font-name)"
else
term_font="$(mate_get "$profile" "font")"
fi
term_font="$(trim_quotes "$term_font")"
fi
        ;;
        "mintty")
@@ -124,30 +293,91 @@ get_term_font() {
        "pantheon"*)
            term_font="$(gsettings get org.pantheon.terminal.settings font)"

            [[ -z "${term_font//\'}" ]] && \
                term_font="$(gsettings get org.gnome.desktop.interface monospace-font-name)"

            term_font="$(trim_quotes "$term_font")"
        ;;
"qterminal")
term_font="$(awk -F '=' '/fontFamily=/ {a=$2} /fontSize=/ {b=$2} END {print a " " b}' \
"${XDG_CONFIG_HOME}/qterminal.org/qterminal.ini")"
;;
        "sakura"*)
            term_font="$(awk -F '=' '/^font=/ {print $2; exit}' \
                         "${XDG_CONFIG_HOME}/sakura/sakura.conf")"
        ;;
"st")
term_font="$(ps -o command= -p "$parent" | grep -F -- "-f")"
if [[ "$term_font" ]]; then
term_font="${term_font/*-f/}"
term_font="${term_font/ -*/}"
else
# On Linux we can get the exact path to the running binary through the procfs
# (in case `st` is launched from outside of $PATH) on other systems we just
# have to guess and assume `st` is invoked from somewhere in the users $PATH
[[ -L /proc/$parent/exe ]] && binary="/proc/$parent/exe" || binary="$(type -p st)"
# Grep the output of strings on the `st` binary for anything that looks vaguely
# like a font definition. NOTE: There is a slight limitation in this approach.
# Technically "Font Name" is a valid font. As it doesn't specify any font options
# though it is hard to match it correctly amongst the rest of the noise.
[[ -n "$binary" ]] && \
term_font="$(strings "$binary" | grep -F -m 1 \
-e "pixelsize=" \
-e "size=" \
-e "antialias=" \
-e "autohint=")"
fi
term_font="${term_font/xft:}"
term_font="${term_font/:*}"
        ;;
        "terminology")
            term_font="$(strings "${XDG_CONFIG_HOME}/terminology/config/standard/base.cfg" |\
                         awk '/^font\.name$/{print a}{a=$0}')"
            term_font="${term_font/.pcf}"
            term_font="${term_font/:*}"
        ;;
        "termite")
            [[ -f "${XDG_CONFIG_HOME}/termite/config" ]] && \
                termite_config="${XDG_CONFIG_HOME}/termite/config"

            term_font="$(awk -F '= ' '/\[options\]/ {
                                          opt=1
                                      }
                                      /^\s*font/ {
                                          if(opt==1) a=$2;
                                          opt=0
                                      } END {print a}' "/etc/xdg/termite/config" \
                                      "$termite_config")"
        ;;
        "urxvt" | "urxvtd" | "rxvt-unicode" | "xterm")
            xrdb="$(xrdb -query)"
            term_font="$(grep -im 1 -e "^${term/d}"'\**\.*font' -e '^\*font' <<< "$xrdb")"
            term_font="${term_font/*"*font:"}"
            term_font="${term_font/*".font:"}"
            term_font="${term_font/*"*.font:"}"
            term_font="$(trim "$term_font")"

            [[ -z "$term_font" && "$term" == "xterm" ]] && \
                term_font="$(grep '^XTerm.vt100.faceName' <<< "$xrdb")"

            term_font="$(trim "${term_font/*"faceName:"}")"

            # xft: isn't required at the beginning so we prepend it if it's missing
            [[ "${term_font:0:1}" != "-" && \
               "${term_font:0:4}" != "xft:" ]] && \
                term_font="xft:$term_font"

            # Xresources has two different font formats, this checks which
            # one is in use and formats it accordingly.
            case "$term_font" in
@@ -156,12 +386,23 @@ get_term_font() {
                    term_font="${term_font/:*}"
                ;;

                "-"*)
                    IFS=- read -r _ _ term_font _ <<< "$term_font"
                ;;
            esac
        ;;
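The two Xresources font formats handled by this `case` can be parsed directly with the same expansions; the sample values below are illustrative:

```shell
# Xft form: the family sits between "xft:" and the next ":".
font="xft:DejaVu Sans Mono:size=11"
font="${font#xft:}"      # strip the "xft:" prefix
font="${font%%:*}"       # drop everything from the first remaining ":"
echo "$font"             # → DejaVu Sans Mono

# XLFD form: the family is the third dash-separated field.
xlfd="-misc-fixed-medium-r-normal--13-120-75-75-c-60-iso8859-1"
IFS=- read -r _ _ family _ <<EOF
$xlfd
EOF
echo "$family"           # → fixed
```

The leading dash of an XLFD name produces an empty first field, which is why the family lands in the third `read` variable.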
        "xfce4-terminal")
            term_font="$(awk -F '=' '/^FontName/{a=$2}/^FontUseSystem=TRUE/{a=$0} END {print a}' \
                         "${XDG_CONFIG_HOME}/xfce4/terminal/terminalrc")"

            [[ "$term_font" == "FontUseSystem=TRUE" ]] && \
                term_font="$(gsettings get org.gnome.desktop.interface monospace-font-name)"

            term_font="$(trim_quotes "$term_font")"

            # Default fallback font hardcoded in terminal-preferences.c
            [[ -z "$term_font" ]] && term_font="Monospace 12"
        ;;
    esac
@@ -6,7 +6,7 @@ RUN \
  DEBIAN_FRONTEND=noninteractive apt-get install -y \
    curl \
    git \
    zsh=5.0.2-3ubuntu6.2 \
    mercurial \
    subversion \
    golang \
@@ -6,7 +6,7 @@ RUN \
  DEBIAN_FRONTEND=noninteractive apt-get install -y \
    curl \
    git \
    zsh=5.1.1-1ubuntu2.2 \
    mercurial \
    subversion \
    golang \
@@ -1,4 +1,4 @@
FROM ubuntu:17.10

RUN \
  apt-get update && \
@@ -6,7 +6,7 @@ RUN \
  DEBIAN_FRONTEND=noninteractive apt-get install -y \
    curl \
    git \
    zsh=5.2-5ubuntu1.2 \
    mercurial \
    subversion \
    golang \
@@ -0,0 +1,40 @@
FROM debian:stretch
# We switch to Debian here, as there seems to be no Zsh 5.3 package in Ubuntu.
RUN \
apt-get update && \
echo 'golang-go golang-go/dashboard boolean false' | debconf-set-selections && \
DEBIAN_FRONTEND=noninteractive apt-get install -y \
curl \
git \
zsh=5.3.1-4+b2 \
mercurial \
subversion \
golang \
jq \
nodejs \
ruby \
python \
python-virtualenv \
sudo \
locales
RUN adduser --shell /bin/zsh --gecos 'fred' --disabled-password fred
# Locale generation is different in debian. We need to enable en_US
# locale and then regenerate locales.
RUN echo "en_US.UTF-8 UTF-8" > /etc/locale.gen
RUN locale-gen "en_US.UTF-8"
COPY docker/fred-sudoers /etc/sudoers.d/fred
USER fred
WORKDIR /home/fred
ENV LANG=en_US.UTF-8
ENV TERM=xterm-256color
ENV DEFAULT_USER=fred
ENV POWERLEVEL9K_ALWAYS_SHOW_CONTEXT=true
RUN touch .zshrc
CMD ["/bin/zsh", "-l"]
@@ -0,0 +1,35 @@
FROM ubuntu:18.04
RUN \
apt-get update && \
echo 'golang-go golang-go/dashboard boolean false' | debconf-set-selections && \
DEBIAN_FRONTEND=noninteractive apt-get install -y \
curl \
git \
zsh=5.4.2-3ubuntu3 \
mercurial \
subversion \
golang \
jq \
nodejs \
ruby \
python \
python-virtualenv \
sudo \
locales
RUN adduser --shell /bin/zsh --gecos 'fred' --disabled-password fred
RUN locale-gen "en_US.UTF-8"
COPY docker/fred-sudoers /etc/sudoers.d/fred
USER fred
WORKDIR /home/fred
ENV LANG=en_US.UTF-8
ENV TERM=xterm-256color
ENV DEFAULT_USER=fred
ENV POWERLEVEL9K_ALWAYS_SHOW_CONTEXT=true
RUN touch .zshrc
CMD ["/bin/zsh", "-l"]
@@ -0,0 +1,35 @@
FROM ubuntu:18.10
RUN \
apt-get update && \
echo 'golang-go golang-go/dashboard boolean false' | debconf-set-selections && \
DEBIAN_FRONTEND=noninteractive apt-get install -y \
curl \
git \
zsh=5.5.1-1ubuntu1 \
mercurial \
subversion \
golang \
jq \
nodejs \
ruby \
python \
python-virtualenv \
sudo \
locales
RUN adduser --shell /bin/zsh --gecos 'fred' --disabled-password fred
RUN locale-gen "en_US.UTF-8"
COPY docker/fred-sudoers /etc/sudoers.d/fred
USER fred
WORKDIR /home/fred
ENV LANG=en_US.UTF-8
ENV TERM=xterm-256color
ENV DEFAULT_USER=fred
ENV POWERLEVEL9K_ALWAYS_SHOW_CONTEXT=true
RUN touch .zshrc
CMD ["/bin/zsh", "-l"]
@@ -9,7 +9,7 @@ for rcfile in "${ZDOTDIR:-$HOME}"/.zprezto/runcoms/^README.md(.N); do
  ln -nsf "$rcfile" "${ZDOTDIR:-$HOME}/.${rcfile:t}"
done

ln -snf "${HOME}/p9k/powerlevel9k.zsh-theme" \
  "${HOME}/.zprezto/modules/prompt/functions/prompt_powerlevel9k_setup"

echo "zstyle ':prezto:module:prompt' theme 'powerlevel9k'" \
@@ -0,0 +1,7 @@
ARG base
FROM p9k:${base}
COPY docker/zshing/install.zsh /tmp/
RUN zsh /tmp/install.zsh
COPY ./ p9k/
@@ -0,0 +1,16 @@
#!zsh
# install zshing https://github.com/zakariaGatter/zshing
git clone https://github.com/zakariaGatter/zshing.git ~/.zshing/zshing
# Link P9K in zshing directory
ln -nsf ~/p9k ~/.zshing/powerlevel9k
{
echo
echo 'ZSHING_PLUGINS=(
"bhilburn/powerlevel9k"
)'
echo
echo "source ~/.zshing/zshing/zshing.zsh"
} >> ~/.zshrc
@@ -6,6 +6,270 @@
# https://github.com/bhilburn/powerlevel9k
################################################################
typeset -gAh __P9K_COLORS
# https://jonasjacek.github.io/colors/
# use color names by default to allow dark/light themes to adjust colors based on names
__P9K_COLORS=(
black 000
red 001
green 002
yellow 003
blue 004
magenta 005
cyan 006
white 007
grey 008
maroon 009
lime 010
olive 011
navy 012
fuchsia 013
purple 013
aqua 014
teal 014
silver 015
grey0 016
navyblue 017
darkblue 018
blue3 019
blue3 020
blue1 021
darkgreen 022
deepskyblue4 023
deepskyblue4 024
deepskyblue4 025
dodgerblue3 026
dodgerblue2 027
green4 028
springgreen4 029
turquoise4 030
deepskyblue3 031
deepskyblue3 032
dodgerblue1 033
green3 034
springgreen3 035
darkcyan 036
lightseagreen 037
deepskyblue2 038
deepskyblue1 039
green3 040
springgreen3 041
springgreen2 042
cyan3 043
darkturquoise 044
turquoise2 045
green1 046
springgreen2 047
springgreen1 048
mediumspringgreen 049
cyan2 050
cyan1 051
darkred 052
deeppink4 053
purple4 054
purple4 055
purple3 056
blueviolet 057
orange4 058
grey37 059
mediumpurple4 060
slateblue3 061
slateblue3 062
royalblue1 063
chartreuse4 064
darkseagreen4 065
paleturquoise4 066
steelblue 067
steelblue3 068
cornflowerblue 069
chartreuse3 070
darkseagreen4 071
cadetblue 072
cadetblue 073
skyblue3 074
steelblue1 075
chartreuse3 076
palegreen3 077
seagreen3 078
aquamarine3 079
mediumturquoise 080
steelblue1 081
chartreuse2 082
seagreen2 083
seagreen1 084
seagreen1 085
aquamarine1 086
darkslategray2 087
darkred 088
deeppink4 089
darkmagenta 090
darkmagenta 091
darkviolet 092
purple 093
orange4 094
lightpink4 095
plum4 096
mediumpurple3 097
mediumpurple3 098
slateblue1 099
yellow4 100
wheat4 101
grey53 102
lightslategrey 103
mediumpurple 104
lightslateblue 105
yellow4 106
darkolivegreen3 107
darkseagreen 108
lightskyblue3 109
lightskyblue3 110
skyblue2 111
chartreuse2 112
darkolivegreen3 113
palegreen3 114
darkseagreen3 115
darkslategray3 116
skyblue1 117
chartreuse1 118
lightgreen 119
lightgreen 120
palegreen1 121
aquamarine1 122
darkslategray1 123
red3 124
deeppink4 125
mediumvioletred 126
magenta3 127
darkviolet 128
purple 129
darkorange3 130
indianred 131
hotpink3 132
mediumorchid3 133
mediumorchid 134
mediumpurple2 135
darkgoldenrod 136
lightsalmon3 137
rosybrown 138
grey63 139
mediumpurple2 140
mediumpurple1 141
gold3 142
darkkhaki 143
navajowhite3 144
grey69 145
lightsteelblue3 146
lightsteelblue 147
yellow3 148
darkolivegreen3 149
darkseagreen3 150
darkseagreen2 151
lightcyan3 152
lightskyblue1 153
greenyellow 154
darkolivegreen2 155
palegreen1 156
darkseagreen2 157
darkseagreen1 158
paleturquoise1 159
red3 160
deeppink3 161
deeppink3 162
magenta3 163
magenta3 164
magenta2 165
darkorange3 166
indianred 167
hotpink3 168
hotpink2 169
orchid 170
mediumorchid1 171
orange3 172
lightsalmon3 173
lightpink3 174
pink3 175
plum3 176
violet 177
gold3 178
lightgoldenrod3 179
tan 180
mistyrose3 181
thistle3 182
plum2 183
yellow3 184
khaki3 185
lightgoldenrod2 186
lightyellow3 187
grey84 188
lightsteelblue1 189
yellow2 190
darkolivegreen1 191
darkolivegreen1 192
darkseagreen1 193
honeydew2 194
lightcyan1 195
red1 196
deeppink2 197
deeppink1 198
deeppink1 199
magenta2 200
magenta1 201
orangered1 202
indianred1 203
indianred1 204
hotpink 205
hotpink 206
mediumorchid1 207
darkorange 208
salmon1 209
lightcoral 210
palevioletred1 211
orchid2 212
orchid1 213
orange1 214
sandybrown 215
lightsalmon1 216
lightpink1 217
pink1 218
plum1 219
gold1 220
lightgoldenrod2 221
lightgoldenrod2 222
navajowhite1 223
mistyrose1 224
thistle1 225
yellow1 226
lightgoldenrod1 227
khaki1 228
wheat1 229
cornsilk1 230
grey100 231
grey3 232
grey7 233
grey11 234
grey15 235
grey19 236
grey23 237
grey27 238
grey30 239
grey35 240
grey39 241
grey42 242
grey46 243
grey50 244
grey54 245
grey58 246
grey62 247
grey66 248
grey70 249
grey74 250
grey78 251
grey82 252
grey85 253
grey89 254
grey93 255
)
function termColors() {
  if [[ $POWERLEVEL9K_IGNORE_TERM_COLORS == true ]]; then
    return
@@ -28,339 +292,56 @@ function termColors() {
# get the proper color code if it does not exist as a name.
function getColor() {
  # If Color is not numerical, try to get the color code.
  if [[ "$1" != <-> ]]; then
    1=$(getColorCode $1)
  fi
  echo -n "$1"
}
# empty parameter resets (stops) background color
function backgroundColor() {
  echo -n "%K{$(getColor $1)}"
}
# empty parameter resets (stops) foreground color
function foregroundColor() {
  echo -n "%F{$(getColor $1)}"
}
# Get numerical color codes. That way we translate ANSI codes
# into ZSH-Style color codes.
function getColorCode() {
  # Early exit: Check if given value is already numerical
  if [[ "$1" == <-> ]]; then
    # Pad color with zeroes
    echo -n "${(l:3::0:)1}"
    return
  fi
typeset -A codes
# https://jonasjacek.github.io/colors/
# use color names by default to allow dark/light themes to adjust colors based on names
codes[black]=000
codes[maroon]=001
codes[green]=002
codes[olive]=003
codes[navy]=004
codes[purple]=005
codes[teal]=006
codes[silver]=007
codes[grey]=008
codes[red]=009
codes[lime]=010
codes[yellow]=011
codes[blue]=012
codes[fuchsia]=013
codes[aqua]=014
codes[white]=015
codes[grey0]=016
codes[navyblue]=017
codes[darkblue]=018
codes[blue3]=019
codes[blue3]=020
codes[blue1]=021
codes[darkgreen]=022
codes[deepskyblue4]=023
codes[deepskyblue4]=024
codes[deepskyblue4]=025
codes[dodgerblue3]=026
codes[dodgerblue2]=027
codes[green4]=028
codes[springgreen4]=029
codes[turquoise4]=030
codes[deepskyblue3]=031
codes[deepskyblue3]=032
codes[dodgerblue1]=033
codes[green3]=034
codes[springgreen3]=035
codes[darkcyan]=036
codes[lightseagreen]=037
codes[deepskyblue2]=038
codes[deepskyblue1]=039
codes[green3]=040
codes[springgreen3]=041
codes[springgreen2]=042
codes[cyan3]=043
codes[darkturquoise]=044
codes[turquoise2]=045
codes[green1]=046
codes[springgreen2]=047
codes[springgreen1]=048
codes[mediumspringgreen]=049
codes[cyan2]=050
codes[cyan1]=051
codes[darkred]=052
codes[deeppink4]=053
codes[purple4]=054
codes[purple4]=055
codes[purple3]=056
codes[blueviolet]=057
codes[orange4]=058
codes[grey37]=059
codes[mediumpurple4]=060
codes[slateblue3]=061
codes[slateblue3]=062
codes[royalblue1]=063
codes[chartreuse4]=064
codes[darkseagreen4]=065
codes[paleturquoise4]=066
codes[steelblue]=067
codes[steelblue3]=068
codes[cornflowerblue]=069
codes[chartreuse3]=070
codes[darkseagreen4]=071
codes[cadetblue]=072
codes[cadetblue]=073
codes[skyblue3]=074
codes[steelblue1]=075
codes[chartreuse3]=076
codes[palegreen3]=077
codes[seagreen3]=078
codes[aquamarine3]=079
codes[mediumturquoise]=080
codes[steelblue1]=081
codes[chartreuse2]=082
codes[seagreen2]=083
codes[seagreen1]=084
codes[seagreen1]=085
codes[aquamarine1]=086
codes[darkslategray2]=087
codes[darkred]=088
codes[deeppink4]=089
codes[darkmagenta]=090
codes[darkmagenta]=091
codes[darkviolet]=092
codes[purple]=093
codes[orange4]=094
codes[lightpink4]=095
codes[plum4]=096
codes[mediumpurple3]=097
codes[mediumpurple3]=098
codes[slateblue1]=099
codes[yellow4]=100
codes[wheat4]=101
codes[grey53]=102
codes[lightslategrey]=103
codes[mediumpurple]=104
codes[lightslateblue]=105
codes[yellow4]=106
codes[darkolivegreen3]=107
codes[darkseagreen]=108
codes[lightskyblue3]=109
codes[lightskyblue3]=110
codes[skyblue2]=111
codes[chartreuse2]=112
codes[darkolivegreen3]=113
codes[palegreen3]=114
codes[darkseagreen3]=115
codes[darkslategray3]=116
codes[skyblue1]=117
codes[chartreuse1]=118
codes[lightgreen]=119
codes[lightgreen]=120
codes[palegreen1]=121
codes[aquamarine1]=122
codes[darkslategray1]=123
codes[red3]=124
codes[deeppink4]=125
codes[mediumvioletred]=126
codes[magenta3]=127
codes[darkviolet]=128
codes[purple]=129
codes[darkorange3]=130
codes[indianred]=131
codes[hotpink3]=132
codes[mediumorchid3]=133
codes[mediumorchid]=134
codes[mediumpurple2]=135
codes[darkgoldenrod]=136
codes[lightsalmon3]=137
codes[rosybrown]=138
codes[grey63]=139
codes[mediumpurple2]=140
codes[mediumpurple1]=141
codes[gold3]=142
codes[darkkhaki]=143
codes[navajowhite3]=144
codes[grey69]=145
codes[lightsteelblue3]=146
codes[lightsteelblue]=147
codes[yellow3]=148
codes[darkolivegreen3]=149
codes[darkseagreen3]=150
codes[darkseagreen2]=151
codes[lightcyan3]=152
codes[lightskyblue1]=153
codes[greenyellow]=154
codes[darkolivegreen2]=155
codes[palegreen1]=156
codes[darkseagreen2]=157
codes[darkseagreen1]=158
codes[paleturquoise1]=159
codes[red3]=160
codes[deeppink3]=161
codes[deeppink3]=162
codes[magenta3]=163
codes[magenta3]=164
codes[magenta2]=165
codes[darkorange3]=166
codes[indianred]=167
codes[hotpink3]=168
codes[hotpink2]=169
codes[orchid]=170
codes[mediumorchid1]=171
codes[orange3]=172
codes[lightsalmon3]=173
codes[lightpink3]=174
codes[pink3]=175
codes[plum3]=176
codes[violet]=177
codes[gold3]=178
codes[lightgoldenrod3]=179
codes[tan]=180
codes[mistyrose3]=181
codes[thistle3]=182
codes[plum2]=183
codes[yellow3]=184
codes[khaki3]=185
codes[lightgoldenrod2]=186
codes[lightyellow3]=187
codes[grey84]=188
codes[lightsteelblue1]=189
codes[yellow2]=190
codes[darkolivegreen1]=191
codes[darkolivegreen1]=192
codes[darkseagreen1]=193
codes[honeydew2]=194
codes[lightcyan1]=195
codes[red1]=196
codes[deeppink2]=197
codes[deeppink1]=198
codes[deeppink1]=199
codes[magenta2]=200
codes[magenta1]=201
codes[orangered1]=202
codes[indianred1]=203
codes[indianred1]=204
codes[hotpink]=205
codes[hotpink]=206
codes[mediumorchid1]=207
codes[darkorange]=208
codes[salmon1]=209
codes[lightcoral]=210
codes[palevioletred1]=211
codes[orchid2]=212
codes[orchid1]=213
codes[orange1]=214
codes[sandybrown]=215
codes[lightsalmon1]=216
codes[lightpink1]=217
codes[pink1]=218
codes[plum1]=219
codes[gold1]=220
codes[lightgoldenrod2]=221
codes[lightgoldenrod2]=222
codes[navajowhite1]=223
codes[mistyrose1]=224
codes[thistle1]=225
codes[yellow1]=226
codes[lightgoldenrod1]=227
codes[khaki1]=228
codes[wheat1]=229
codes[cornsilk1]=230
codes[grey100]=231
codes[grey3]=232
codes[grey7]=233
codes[grey11]=234
codes[grey15]=235
codes[grey19]=236
codes[grey23]=237
codes[grey27]=238
codes[grey30]=239
codes[grey35]=240
codes[grey39]=241
codes[grey42]=242
codes[grey46]=243
codes[grey50]=244
codes[grey54]=245
codes[grey58]=246
codes[grey62]=247
codes[grey66]=248
codes[grey70]=249
codes[grey74]=250
codes[grey78]=251
codes[grey82]=252
codes[grey85]=253
codes[grey89]=254
codes[grey93]=255
  local colorName="${1}"
  # Check if value is none with any case.
  if [[ "${(L)colorName}" == "none" ]]; then
    echo -n 'none'
  elif [[ "${colorName}" == "foreground" ]]; then
    # for testing purposes in terminal
    # call via `getColorCode foreground`
    for i in "${(k@)__P9K_COLORS}"; do
      print -P "$(foregroundColor $i)$(getColor $i) - $i%f"
    done
  elif [[ "${colorName}" == "background" ]]; then
    # call via `getColorCode background`
    for i in "${(k@)__P9K_COLORS}"; do
      print -P "$(backgroundColor $i)$(getColor $i) - $i%k"
    done
  else
    # Strip eventual "bg-" prefixes
    colorName=${colorName#bg-}
    # Strip eventual "fg-" prefixes
    colorName=${colorName#fg-}
    # Strip eventual "br" prefixes ("bright" colors)
    colorName=${colorName#br}
    echo -n $__P9K_COLORS[$colorName]
  fi
}
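The zero-padding performed by the zsh expansion `${(l:3::0:)1}` above can be mimicked in plain shell with `printf` (a sketch for illustration; the theme itself relies on the zsh-native form):

```shell
# Left-pad a color code to three digits with zeroes, like ${(l:3::0:)1}.
pad_color() { printf '%03d' "$1"; }

pad_color 2; echo     # → 002
pad_color 255; echo   # → 255
```

Padding keeps the numeric codes aligned with the three-digit values stored in the `__P9K_COLORS` table.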
@@ -48,24 +48,24 @@ case $POWERLEVEL9K_MODE in
    FREEBSD_ICON                   $'\U1F608 '        # 😈
    ANDROID_ICON                   $'\uE270'          # 
    LINUX_ICON                     $'\uE271'          # 
    LINUX_ARCH_ICON                $'\uE271'          # 
    LINUX_DEBIAN_ICON              $'\uE271'          # 
    LINUX_UBUNTU_ICON              $'\uE271'          # 
    LINUX_CENTOS_ICON              $'\uE271'          # 
    LINUX_COREOS_ICON              $'\uE271'          # 
    LINUX_ELEMENTARY_ICON          $'\uE271'          # 
    LINUX_MINT_ICON                $'\uE271'          # 
    LINUX_FEDORA_ICON              $'\uE271'          # 
    LINUX_GENTOO_ICON              $'\uE271'          # 
    LINUX_MAGEIA_ICON              $'\uE271'          # 
    LINUX_NIXOS_ICON               $'\uE271'          # 
    LINUX_MANJARO_ICON             $'\uE271'          # 
    LINUX_DEVUAN_ICON              $'\uE271'          # 
    LINUX_ALPINE_ICON              $'\uE271'          # 
    LINUX_AOSC_ICON                $'\uE271'          # 
    LINUX_OPENSUSE_ICON            $'\uE271'          # 
    LINUX_SABAYON_ICON             $'\uE271'          # 
    LINUX_SLACKWARE_ICON           $'\uE271'          # 
    SUNOS_ICON                     $'\U1F31E '        # 🌞
    HOME_ICON                      $'\uE12C'          # 
    HOME_SUB_ICON                  $'\uE18D'          # 
@@ -148,24 +148,24 @@ case $POWERLEVEL9K_MODE in
    FREEBSD_ICON                   $'\U1F608 '        # 😈
    ANDROID_ICON                   $'\uE17B'          # 
    LINUX_ICON                     $'\uF17C'          # 
    LINUX_ARCH_ICON                $'\uF17C'          # 
    LINUX_DEBIAN_ICON              $'\uF17C'          # 
    LINUX_UBUNTU_ICON              $'\uF17C'          # 
    LINUX_CENTOS_ICON              $'\uF17C'          # 
    LINUX_COREOS_ICON              $'\uF17C'          # 
    LINUX_ELEMENTARY_ICON          $'\uF17C'          # 
    LINUX_MINT_ICON                $'\uF17C'          # 
    LINUX_FEDORA_ICON              $'\uF17C'          # 
    LINUX_GENTOO_ICON              $'\uF17C'          # 
    LINUX_MAGEIA_ICON              $'\uF17C'          # 
    LINUX_NIXOS_ICON               $'\uF17C'          # 
    LINUX_MANJARO_ICON             $'\uF17C'          # 
    LINUX_DEVUAN_ICON              $'\uF17C'          # 
    LINUX_ALPINE_ICON              $'\uF17C'          # 
    LINUX_AOSC_ICON                $'\uF17C'          # 
    LINUX_OPENSUSE_ICON            $'\uF17C'          # 
    LINUX_SABAYON_ICON             $'\uF17C'          # 
    LINUX_SLACKWARE_ICON           $'\uF17C'          # 
    SUNOS_ICON                     $'\uF185 '         # 
    HOME_ICON                      $'\uF015'          # 
    HOME_SUB_ICON                  $'\uF07C'          # 
@@ -250,24 +250,24 @@ case $POWERLEVEL9K_MODE in
    APPLE_ICON                     '\u'$CODEPOINT_OF_AWESOME_APPLE             # 
    FREEBSD_ICON                   $'\U1F608 '                                 # 😈
    LINUX_ICON                     '\u'$CODEPOINT_OF_AWESOME_LINUX             # 
    LINUX_ARCH_ICON                '\u'$CODEPOINT_OF_AWESOME_LINUX             # 
    LINUX_DEBIAN_ICON              '\u'$CODEPOINT_OF_AWESOME_LINUX             # 
    LINUX_UBUNTU_ICON              '\u'$CODEPOINT_OF_AWESOME_LINUX             # 
    LINUX_CENTOS_ICON              '\u'$CODEPOINT_OF_AWESOME_LINUX             # 
    LINUX_COREOS_ICON              '\u'$CODEPOINT_OF_AWESOME_LINUX             # 
    LINUX_ELEMENTARY_ICON          '\u'$CODEPOINT_OF_AWESOME_LINUX             # 
    LINUX_MINT_ICON                '\u'$CODEPOINT_OF_AWESOME_LINUX             # 
    LINUX_FEDORA_ICON              '\u'$CODEPOINT_OF_AWESOME_LINUX             # 
    LINUX_GENTOO_ICON              '\u'$CODEPOINT_OF_AWESOME_LINUX             # 
    LINUX_MAGEIA_ICON              '\u'$CODEPOINT_OF_AWESOME_LINUX             # 
    LINUX_NIXOS_ICON               '\u'$CODEPOINT_OF_AWESOME_LINUX             # 
    LINUX_MANJARO_ICON             '\u'$CODEPOINT_OF_AWESOME_LINUX             # 
    LINUX_DEVUAN_ICON              '\u'$CODEPOINT_OF_AWESOME_LINUX             # 
    LINUX_ALPINE_ICON              '\u'$CODEPOINT_OF_AWESOME_LINUX             # 
    LINUX_AOSC_ICON                '\u'$CODEPOINT_OF_AWESOME_LINUX             # 
    LINUX_OPENSUSE_ICON            '\u'$CODEPOINT_OF_AWESOME_LINUX             # 
    LINUX_SABAYON_ICON             '\u'$CODEPOINT_OF_AWESOME_LINUX             # 
    LINUX_SLACKWARE_ICON           '\u'$CODEPOINT_OF_AWESOME_LINUX             # 
    SUNOS_ICON                     '\u'$CODEPOINT_OF_AWESOME_SUN_O' '          # 
    HOME_ICON                      '\u'$CODEPOINT_OF_AWESOME_HOME              # 
    HOME_SUB_ICON                  '\u'$CODEPOINT_OF_AWESOME_FOLDER_OPEN       # 

View File

@@ -9,9 +9,7 @@
 # Exits with 0 if a variable has been previously defined (even if empty)
 # Takes the name of a variable that should be checked.
 function defined() {
-  local varname="$1"
-  typeset -p "$varname" > /dev/null 2>&1
+  [[ ! -z "${(tP)1}" ]]
 }
 # Given the name of a variable and a default value, sets the variable

View File

@@ -111,45 +111,54 @@ CURRENT_BG='NONE'
 set_default last_left_element_index 1
 set_default POWERLEVEL9K_WHITESPACE_BETWEEN_LEFT_SEGMENTS " "
 left_prompt_segment() {
+  local segment_name="${1}"
   local current_index=$2
   # Check if the segment should be joined with the previous one
   local joined
   segmentShouldBeJoined $current_index $last_left_element_index "$POWERLEVEL9K_LEFT_PROMPT_ELEMENTS" && joined=true || joined=false
+  # Colors
+  local backgroundColor="${3}"
+  local foregroundColor="${4}"
   # Overwrite given background-color by user defined variable for this segment.
-  local BACKGROUND_USER_VARIABLE=POWERLEVEL9K_${(U)1#prompt_}_BACKGROUND
+  local BACKGROUND_USER_VARIABLE=POWERLEVEL9K_${(U)${segment_name}#prompt_}_BACKGROUND
   local BG_COLOR_MODIFIER=${(P)BACKGROUND_USER_VARIABLE}
-  [[ -n $BG_COLOR_MODIFIER ]] && 3="$BG_COLOR_MODIFIER"
+  [[ -n $BG_COLOR_MODIFIER ]] && backgroundColor="$BG_COLOR_MODIFIER"
   # Overwrite given foreground-color by user defined variable for this segment.
-  local FOREGROUND_USER_VARIABLE=POWERLEVEL9K_${(U)1#prompt_}_FOREGROUND
+  local FOREGROUND_USER_VARIABLE=POWERLEVEL9K_${(U)${segment_name}#prompt_}_FOREGROUND
   local FG_COLOR_MODIFIER=${(P)FOREGROUND_USER_VARIABLE}
-  [[ -n $FG_COLOR_MODIFIER ]] && 4="$FG_COLOR_MODIFIER"
-  local bg fg
-  [[ -n "$3" ]] && bg="$(backgroundColor $3)" || bg="$(backgroundColor)"
-  [[ -n "$4" ]] && fg="$(foregroundColor $4)" || fg="$(foregroundColor)"
-  if [[ $CURRENT_BG != 'NONE' ]] && ! isSameColor "$3" "$CURRENT_BG"; then
-    echo -n "$bg%F{$CURRENT_BG}"
+  [[ -n $FG_COLOR_MODIFIER ]] && foregroundColor="$FG_COLOR_MODIFIER"
+  # Get color codes here to save some calls later on
+  backgroundColor="$(getColorCode ${backgroundColor})"
+  foregroundColor="$(getColorCode ${foregroundColor})"
+  local background foreground
+  [[ -n "${backgroundColor}" ]] && background="$(backgroundColor ${backgroundColor})" || background="%k"
+  [[ -n "${foregroundColor}" ]] && foreground="$(foregroundColor ${foregroundColor})" || foreground="%f"
+  if [[ $CURRENT_BG != 'NONE' ]] && ! isSameColor "${backgroundColor}" "$CURRENT_BG"; then
+    echo -n "${background}%F{$CURRENT_BG}"
     if [[ $joined == false ]]; then
       # Middle segment
       echo -n "$(print_icon 'LEFT_SEGMENT_SEPARATOR')$POWERLEVEL9K_WHITESPACE_BETWEEN_LEFT_SEGMENTS"
     fi
-  elif isSameColor "$CURRENT_BG" "$3"; then
+  elif isSameColor "$CURRENT_BG" "${backgroundColor}"; then
     # Middle segment with same color as previous segment
     # We take the current foreground color as color for our
     # subsegment (or the default color). This should have
     # enough contrast.
     local complement
-    [[ -n "$4" ]] && complement="$fg" || complement="$(foregroundColor $DEFAULT_COLOR)"
-    echo -n "${bg}${complement}"
+    [[ -n "${foregroundColor}" ]] && complement="${foreground}" || complement="$(foregroundColor $DEFAULT_COLOR)"
+    echo -n "${background}${complement}"
     if [[ $joined == false ]]; then
       echo -n "$(print_icon 'LEFT_SUBSEGMENT_SEPARATOR')$POWERLEVEL9K_WHITESPACE_BETWEEN_LEFT_SEGMENTS"
     fi
   else
     # First segment
-    echo -n "${bg}$POWERLEVEL9K_WHITESPACE_BETWEEN_LEFT_SEGMENTS"
+    echo -n "${background}$POWERLEVEL9K_WHITESPACE_BETWEEN_LEFT_SEGMENTS"
   fi
   local visual_identifier
@@ -161,26 +170,26 @@ left_prompt_segment() {
       # we need to color both the visual identifier and the whitespace.
       [[ -n "$5" ]] && visual_identifier="$visual_identifier "
       # Allow users to overwrite the color for the visual identifier only.
-      local visual_identifier_color_variable=POWERLEVEL9K_${(U)1#prompt_}_VISUAL_IDENTIFIER_COLOR
-      set_default $visual_identifier_color_variable $4
-      visual_identifier="%F{${(P)visual_identifier_color_variable}%}$visual_identifier%f"
+      local visual_identifier_color_variable=POWERLEVEL9K_${(U)${segment_name}#prompt_}_VISUAL_IDENTIFIER_COLOR
+      set_default $visual_identifier_color_variable "${foregroundColor}"
+      visual_identifier="$(foregroundColor ${(P)visual_identifier_color_variable})${visual_identifier}%f"
     fi
   fi
   # Print the visual identifier
   echo -n "${visual_identifier}"
   # Print the content of the segment, if there is any
-  [[ -n "$5" ]] && echo -n "${fg}${5}"
+  [[ -n "$5" ]] && echo -n "${foreground}${5}"
   echo -n "${POWERLEVEL9K_WHITESPACE_BETWEEN_LEFT_SEGMENTS}"
-  CURRENT_BG=$3
+  CURRENT_BG="${backgroundColor}"
   last_left_element_index=$current_index
 }
 # End the left prompt, closes the final segment.
 left_prompt_end() {
   if [[ -n $CURRENT_BG ]]; then
-    echo -n "%k%F{$CURRENT_BG}$(print_icon 'LEFT_SEGMENT_SEPARATOR')"
+    echo -n "%k$(foregroundColor ${CURRENT_BG})$(print_icon 'LEFT_SEGMENT_SEPARATOR')"
   else
     echo -n "%k"
   fi
@@ -203,25 +212,34 @@ CURRENT_RIGHT_BG='NONE'
 set_default last_right_element_index 1
 set_default POWERLEVEL9K_WHITESPACE_BETWEEN_RIGHT_SEGMENTS " "
 right_prompt_segment() {
+  local segment_name="${1}"
   local current_index=$2
   # Check if the segment should be joined with the previous one
   local joined
   segmentShouldBeJoined $current_index $last_right_element_index "$POWERLEVEL9K_RIGHT_PROMPT_ELEMENTS" && joined=true || joined=false
+  # Colors
+  local backgroundColor="${3}"
+  local foregroundColor="${4}"
   # Overwrite given background-color by user defined variable for this segment.
-  local BACKGROUND_USER_VARIABLE=POWERLEVEL9K_${(U)1#prompt_}_BACKGROUND
+  local BACKGROUND_USER_VARIABLE=POWERLEVEL9K_${(U)${segment_name}#prompt_}_BACKGROUND
   local BG_COLOR_MODIFIER=${(P)BACKGROUND_USER_VARIABLE}
-  [[ -n $BG_COLOR_MODIFIER ]] && 3="$BG_COLOR_MODIFIER"
+  [[ -n $BG_COLOR_MODIFIER ]] && backgroundColor="$BG_COLOR_MODIFIER"
   # Overwrite given foreground-color by user defined variable for this segment.
-  local FOREGROUND_USER_VARIABLE=POWERLEVEL9K_${(U)1#prompt_}_FOREGROUND
+  local FOREGROUND_USER_VARIABLE=POWERLEVEL9K_${(U)${segment_name}#prompt_}_FOREGROUND
   local FG_COLOR_MODIFIER=${(P)FOREGROUND_USER_VARIABLE}
-  [[ -n $FG_COLOR_MODIFIER ]] && 4="$FG_COLOR_MODIFIER"
-  local bg fg
-  [[ -n "$3" ]] && bg="$(backgroundColor $3)" || bg="$(backgroundColor)"
-  [[ -n "$4" ]] && fg="$(foregroundColor $4)" || fg="$(foregroundColor)"
+  [[ -n $FG_COLOR_MODIFIER ]] && foregroundColor="$FG_COLOR_MODIFIER"
+  # Get color codes here to save some calls later on
+  backgroundColor="$(getColorCode ${backgroundColor})"
+  foregroundColor="$(getColorCode ${foregroundColor})"
+  local background foreground
+  [[ -n "${backgroundColor}" ]] && background="$(backgroundColor ${backgroundColor})" || background="%k"
+  [[ -n "${foregroundColor}" ]] && foreground="$(foregroundColor ${foregroundColor})" || foreground="%f"
   # If CURRENT_RIGHT_BG is "NONE", we are the first right segment.
@@ -231,17 +249,17 @@ right_prompt_segment() {
   fi
   if [[ $joined == false ]] || [[ "$CURRENT_RIGHT_BG" == "NONE" ]]; then
-    if isSameColor "$CURRENT_RIGHT_BG" "$3"; then
+    if isSameColor "$CURRENT_RIGHT_BG" "${backgroundColor}"; then
       # Middle segment with same color as previous segment
       # We take the current foreground color as color for our
       # subsegment (or the default color). This should have
       # enough contrast.
       local complement
-      [[ -n "$4" ]] && complement="$fg" || complement="$(foregroundColor $DEFAULT_COLOR)"
+      [[ -n "${foregroundColor}" ]] && complement="${foreground}" || complement="$(foregroundColor $DEFAULT_COLOR)"
       echo -n "$complement$(print_icon 'RIGHT_SUBSEGMENT_SEPARATOR')%f"
     else
-      # Use the new BG color for the foreground with separator
-      echo -n "$(foregroundColor $3)$(print_icon 'RIGHT_SEGMENT_SEPARATOR')%f"
+      # Use the new Background Color as the foreground of the segment separator
+      echo -n "$(foregroundColor ${backgroundColor})$(print_icon 'RIGHT_SEGMENT_SEPARATOR')%f"
     fi
   fi
@@ -254,13 +272,13 @@ right_prompt_segment() {
       # we need to color both the visual identifier and the whitespace.
       [[ -n "$5" ]] && visual_identifier=" $visual_identifier"
       # Allow users to overwrite the color for the visual identifier only.
-      local visual_identifier_color_variable=POWERLEVEL9K_${(U)1#prompt_}_VISUAL_IDENTIFIER_COLOR
-      set_default $visual_identifier_color_variable $4
-      visual_identifier="%F{${(P)visual_identifier_color_variable}%}$visual_identifier%f"
+      local visual_identifier_color_variable=POWERLEVEL9K_${(U)${segment_name}#prompt_}_VISUAL_IDENTIFIER_COLOR
+      set_default $visual_identifier_color_variable "${foregroundColor}"
+      visual_identifier="$(foregroundColor ${(P)visual_identifier_color_variable})${visual_identifier}%f"
     fi
   fi
-  echo -n "${bg}${fg}"
+  echo -n "${background}${foreground}"
   # Print whitespace only if segment is not joined or first right segment
   [[ $joined == false ]] || [[ "$CURRENT_RIGHT_BG" == "NONE" ]] && echo -n "${POWERLEVEL9K_WHITESPACE_BETWEEN_RIGHT_SEGMENTS}"
@@ -270,7 +288,7 @@ right_prompt_segment() {
   # Print the visual identifier
   echo -n "${visual_identifier}"
-  CURRENT_RIGHT_BG=$3
+  CURRENT_RIGHT_BG="${backgroundColor}"
   last_right_element_index=$current_index
 }
@@ -278,11 +296,6 @@ right_prompt_segment() {
 # Prompt Segment Definitions
 ################################################################
-# The `CURRENT_BG` variable is used to remember what the last BG color used was
-# when building the left-hand prompt. Because the RPROMPT is created from
-# right-left but reads the opposite, this isn't necessary for the other side.
-CURRENT_BG='NONE'
-
 ################################################################
 # Anaconda Environment
 prompt_anaconda() {
@@ -409,12 +422,13 @@ prompt_battery() {
     'charged'      'green'
     'disconnected' "$DEFAULT_COLOR_INVERTED"
   )
+  local ROOT_PREFIX="${4}"
   # Set default values if the user did not configure them
   set_default POWERLEVEL9K_BATTERY_LOW_THRESHOLD 10
-  if [[ $OS =~ OSX && -f /usr/bin/pmset && -x /usr/bin/pmset ]]; then
+  if [[ $OS =~ OSX && -f "${ROOT_PREFIX}"/usr/bin/pmset && -x "${ROOT_PREFIX}"/usr/bin/pmset ]]; then
     # obtain battery information from system
-    local raw_data="$(pmset -g batt | awk 'FNR==2{print}')"
+    local raw_data="$(${ROOT_PREFIX}/usr/bin/pmset -g batt | awk 'FNR==2{print}')"
     # return if there is no battery on system
     [[ -z $(echo $raw_data | grep "InternalBattery") ]] && return
@@ -446,7 +460,7 @@ prompt_battery() {
   fi
   if [[ "$OS" == 'Linux' ]] || [[ "$OS" == 'Android' ]]; then
-    local sysp="/sys/class/power_supply"
+    local sysp="${ROOT_PREFIX}/sys/class/power_supply"
     # Reported BAT0 or BAT1 depending on kernel version
     [[ -a $sysp/BAT0 ]] && local bat=$sysp/BAT0
@@ -468,8 +482,8 @@ prompt_battery() {
       [[ $bat_percent =~ 100 ]] && current_state="charged"
       [[ $bat_percent -lt 100 ]] && current_state="charging"
     fi
-    if [[ -f /usr/bin/acpi ]]; then
-      local time_remaining=$(acpi | awk '{ print $5 }')
+    if [[ -f ${ROOT_PREFIX}/usr/bin/acpi ]]; then
+      local time_remaining=$(${ROOT_PREFIX}/usr/bin/acpi | awk '{ print $5 }')
       if [[ $time_remaining =~ rate ]]; then
         local tstring="..."
       elif [[ $time_remaining =~ "[[:digit:]]+" ]]; then
@@ -498,7 +512,7 @@ prompt_battery() {
     fi
   fi
   # return if POWERLEVEL9K_BATTERY_HIDE_ABOVE_THRESHOLD is set and the battery percentage is greater or equal
-  if [[ -v "POWERLEVEL9K_BATTERY_HIDE_ABOVE_THRESHOLD" && "${bat_percent}" -ge $POWERLEVEL9K_BATTERY_HIDE_ABOVE_THRESHOLD ]]; then
+  if defined POWERLEVEL9K_BATTERY_HIDE_ABOVE_THRESHOLD && [[ "${bat_percent}" -ge $POWERLEVEL9K_BATTERY_HIDE_ABOVE_THRESHOLD ]]; then
     return
   fi
@@ -623,12 +637,12 @@ prompt_context() {
   if [[ $(print -P "%#") == '#' ]]; then
     current_state="ROOT"
   elif [[ -n "$SSH_CLIENT" || -n "$SSH_TTY" ]]; then
-    if sudo -n true 2>/dev/null; then
+    if [[ -n "$SUDO_COMMAND" ]]; then
       current_state="REMOTE_SUDO"
     else
       current_state="REMOTE"
     fi
-  elif sudo -n true 2>/dev/null; then
+  elif [[ -n "$SUDO_COMMAND" ]]; then
     current_state="SUDO"
   fi
@@ -651,7 +665,7 @@ prompt_user() {
       "FOREGROUND_COLOR" "yellow"
       "VISUAL_IDENTIFIER" "ROOT_ICON"
     )
-  elif sudo -n true 2>/dev/null; then
+  elif [[ -n "$SUDO_COMMAND" ]]; then
     user_state=(
       "STATE" "SUDO"
       "CONTENT" "${POWERLEVEL9K_USER_TEMPLATE}"
@@ -702,11 +716,13 @@ prompt_host() {
 # The 'custom` prompt provides a way for users to invoke commands and display
 # the output in a segment.
 prompt_custom() {
-  local command=POWERLEVEL9K_CUSTOM_$3:u
+  local segment_name="${3:u}"
+  # Get content of custom segment
+  local command="POWERLEVEL9K_CUSTOM_${segment_name}"
   local segment_content="$(eval ${(P)command})"
   if [[ -n $segment_content ]]; then
-    "$1_prompt_segment" "${0}_${3:u}" "$2" $DEFAULT_COLOR_INVERTED $DEFAULT_COLOR "$segment_content"
+    "$1_prompt_segment" "${0}_${3:u}" "$2" $DEFAULT_COLOR_INVERTED $DEFAULT_COLOR "$segment_content" "CUSTOM_${segment_name}_ICON"
   fi
 }
@@ -744,6 +760,31 @@ prompt_command_execution_time() {
   fi
 }
+################################################################
+# Determine the unique path - this is needed for the
+# truncate_to_unique strategy.
+#
+function getUniqueFolder() {
+  local trunc_path directory test_dir test_dir_length
+  local -a matching
+  local -a paths
+  local cur_path='/'
+  paths=(${(s:/:)1})
+  for directory in ${paths[@]}; do
+    test_dir=''
+    for (( i=0; i < ${#directory}; i++ )); do
+      test_dir+="${directory:$i:1}"
+      matching=("$cur_path"/"$test_dir"*/)
+      if [[ ${#matching[@]} -eq 1 ]]; then
+        break
+      fi
+    done
+    trunc_path+="$test_dir/"
+    cur_path+="$directory/"
+  done
+  echo "${trunc_path: : -1}"
+}
 ################################################################
 # Dir: current working directory
 # Parameters:
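The `getUniqueFolder` helper added in this hunk finds, for each path component, the shortest leading substring that is unambiguous among that directory's entries (the zsh version discovers siblings via filesystem globbing). A minimal Python sketch of the same idea, operating on an explicit map of directory contents instead of the real filesystem; the tree and names below are illustrative only:

```python
def unique_prefix(name, siblings):
    """Shortest leading substring of `name` matching only `name` among
    `siblings` (the entries of the containing directory, incl. `name`)."""
    for i in range(1, len(name) + 1):
        prefix = name[:i]
        if sum(1 for s in siblings if s.startswith(prefix)) == 1:
            return prefix
    return name  # ambiguous even at full length (e.g. "al" vs "albert")

def truncate_to_unique(parts, tree):
    """Truncate each component of a path against a {dir: [entries]} map."""
    out, cur = [], "/"
    for part in parts:
        out.append(unique_prefix(part, tree.get(cur, [part])))
        cur = cur.rstrip("/") + "/" + part
    return "/".join(out)

tree = {"/": ["home", "usr", "var"], "/home": ["alice", "albert"]}
print(truncate_to_unique(["home", "alice"], tree))  # -> h/ali
```

"h" already singles out `home` among `usr` and `var`, while `alice` needs three characters to be distinguished from `albert`.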
@@ -811,23 +852,10 @@ prompt_dir() {
         # for each parent path component find the shortest unique beginning
         # characters sequence. Source: https://stackoverflow.com/a/45336078
         if (( ${#current_path} > 1 )); then # root and home are exceptions and won't have paths
-          local matching
-          local cur_path='/'
-          [[ $current_path != "~"* ]] && trunc_path='/' || trunc_path=''
-          for directory in ${paths[@]}; do
-            test_dir=''
-            for (( i=0; i<${#directory}; i++ )); do
-              test_dir+="${directory:$i:1}"
-              matching=("$cur_path"/"$test_dir"*/)
-              if [[ ${#matching[@]} -eq 1 ]]; then
-                break
-              fi
-            done
-            trunc_path+="$test_dir/"
-            cur_path+="$directory/"
-          done
-          [[ $current_path == "~"* ]] && trunc_path="~/$trunc_path"
-          current_path="${trunc_path: : -1}"
+          # cheating here to retain ~ as home folder
+          local home_path="$(getUniqueFolder $HOME)"
+          trunc_path="$(getUniqueFolder $PWD)"
+          [[ $current_path == "~"* ]] && current_path="~${trunc_path//${home_path}/}" || current_path="/${trunc_path}"
         fi
       ;;
       truncate_with_folder_marker)
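The rewritten `truncate_to_unique` branch computes unique-truncated forms of both `$PWD` and `$HOME`, then strips the home prefix so the display keeps the `~` shorthand. A pure-string Python sketch of that substitution step; the truncated path values here are illustrative:

```python
def apply_home_shorthand(current_path, trunc_path, home_path):
    """current_path: display path before truncation (e.g. "~/projects");
    trunc_path / home_path: unique-truncated forms of $PWD and $HOME."""
    if current_path.startswith("~"):
        # Mirrors zsh's ${trunc_path//${home_path}/}: drop the home part.
        return "~" + trunc_path.replace(home_path, "")
    return "/" + trunc_path

# e.g. $HOME=/home/alice -> "h/ali", $PWD=/home/alice/projects -> "h/ali/pro"
print(apply_home_shorthand("~/projects", "h/ali/pro", "h/ali"))  # -> ~/pro
```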
@@ -1043,18 +1071,14 @@ prompt_history() {
 ################################################################
 # Detection for virtualization (systemd based systems only)
 prompt_detect_virt() {
-  if ! command -v systemd-detect-virt > /dev/null; then
-    return
-  fi
-  local virt=$(systemd-detect-virt)
+  local virt=$(systemd-detect-virt 2> /dev/null)
   if [[ "$virt" == "none" ]]; then
     if [[ "$(ls -di / | grep -o 2)" != "2" ]]; then
       virt="chroot"
-      "$1_prompt_segment" "$0" "$2" "$DEFAULT_COLOR" "yellow" "$virt"
-    else
-      ;
     fi
-  else
+  fi
+
+  if [[ -n "${virt}" ]]; then
     "$1_prompt_segment" "$0" "$2" "$DEFAULT_COLOR" "yellow" "$virt"
   fi
 }
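The chroot fallback above leans on the convention that the root directory's inode number is 2 on ext2/3/4 filesystems, so a different inode for `/` suggests the process is running inside a chroot. A hedged Python rendering of that heuristic (note it is a heuristic only; other filesystems use different root inode numbers):

```python
import os

ROOT_INODE = 2  # conventional inode of "/" on ext2/3/4 filesystems

def looks_like_chroot(root_inode=None):
    """Heuristic: returns True when "/" does not have the conventional
    ext* root inode. Can report false positives on btrfs, xfs, etc."""
    if root_inode is None:
        root_inode = os.stat("/").st_ino
    return root_inode != ROOT_INODE
```

With the inode injected for testing: `looks_like_chroot(2)` is `False`, while an unusual value such as `256` (seen e.g. on btrfs roots) reports `True`.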
@@ -1089,18 +1113,20 @@ prompt_ip() {
   else
     if defined POWERLEVEL9K_IP_INTERFACE; then
       # Get the IP address of the specified interface.
-      ip=$(ip -4 a show "$POWERLEVEL9K_IP_INTERFACE" | grep -o "inet\s*[0-9.]*" | grep -o "[0-9.]*")
+      ip=$(ip -4 a show "$POWERLEVEL9K_IP_INTERFACE" | grep -o "inet\s*[0-9.]*" | grep -o -E "[0-9.]+")
     else
       local interfaces callback
       # Get all network interface names that are up
-      interfaces=$(ip link ls up | grep -o -E ":\s+[a-z0-9]+:" | grep -v "lo" | grep -o "[a-z0-9]*")
-      callback='ip -4 a show $item | grep -o "inet\s*[0-9.]*" | grep -o "[0-9.]*"'
+      interfaces=$(ip link ls up | grep -o -E ":\s+[a-z0-9]+:" | grep -v "lo" | grep -o -E "[a-z0-9]+")
+      callback='ip -4 a show $item | grep -o "inet\s*[0-9.]*" | grep -o -E "[0-9.]+"'
       ip=$(getRelevantItem "$interfaces" "$callback")
     fi
   fi
-  "$1_prompt_segment" "$0" "$2" "cyan" "$DEFAULT_COLOR" "$ip" 'NETWORK_ICON'
+  if [[ -n "$ip" ]]; then
+    "$1_prompt_segment" "$0" "$2" "cyan" "$DEFAULT_COLOR" "$ip" 'NETWORK_ICON'
+  fi
 }
 ################################################################
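The `prompt_ip` change swaps `grep -o "[0-9.]*"` for `grep -o -E "[0-9.]+"`: a `*` pattern also matches the empty string, while `+` requires at least one character. The difference is easy to see with Python's `re` on a sample `ip` output line (a simplified illustration; the real pipeline first narrows the line with `grep -o "inet\s*[0-9.]*"`):

```python
import re

line = "inet 10.0.0.1/24 brd 10.0.0.255"

# "[0-9.]*" can match the empty string, so every non-matching
# position contributes an empty hit.
star = re.findall(r"[0-9.]*", line)

# "[0-9.]+" needs at least one character, yielding only real tokens.
plus = re.findall(r"[0-9.]+", line)

print(plus)  # -> ['10.0.0.1', '24', '10.0.0.255']
```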
@@ -1119,10 +1145,9 @@ prompt_vpn_ip() {
 # Segment to display laravel version
 prompt_laravel_version() {
   local laravel_version="$(php artisan --version 2> /dev/null)"
-  if [[ -n "${laravel_version}" ]]; then
-    # Remove unrelevant infos
-    laravel_version="${laravel_version//Laravel Framework version /}"
+  if [[ -n "${laravel_version}" && "${laravel_version}" =~ "Laravel Framework" ]]; then
+    # Strip out everything but the version
+    laravel_version="${laravel_version//Laravel Framework /}"
     "$1_prompt_segment" "$0" "$2" "maroon" "white" "${laravel_version}" 'LARAVEL_ICON'
   fi
 }
@@ -1131,6 +1156,7 @@ prompt_laravel_version() {
 # Segment to display load
 set_default POWERLEVEL9K_LOAD_WHICH 5
 prompt_load() {
+  local ROOT_PREFIX="${4}"
   # The load segment can have three different states
   local current_state="unknown"
   local load_select=2
@@ -1166,7 +1192,7 @@ prompt_load() {
       fi
     ;;
     *)
-      load_avg=$(cut -d" " -f${load_select} /proc/loadavg)
+      load_avg=$(cut -d" " -f${load_select} ${ROOT_PREFIX}/proc/loadavg)
       cores=$(nproc)
   esac
@@ -1239,6 +1265,7 @@ prompt_php_version() {
 ################################################################
 # Segment to display free RAM and used Swap
 prompt_ram() {
+  local ROOT_PREFIX="${4}"
   local base=''
   local ramfree=0
   if [[ "$OS" == "OSX" ]]; then
@@ -1250,9 +1277,9 @@ prompt_ram() {
     ramfree=$(( ramfree * 4096 ))
   else
     if [[ "$OS" == "BSD" ]]; then
-      ramfree=$(grep 'avail memory' /var/run/dmesg.boot | awk '{print $4}')
+      ramfree=$(grep 'avail memory' ${ROOT_PREFIX}/var/run/dmesg.boot | awk '{print $4}')
     else
-      ramfree=$(grep -o -E "MemAvailable:\s+[0-9]+" /proc/meminfo | grep -o "[0-9]*")
+      ramfree=$(grep -o -E "MemAvailable:\s+[0-9]+" ${ROOT_PREFIX}/proc/meminfo | grep -o -E "[0-9]+")
       base='K'
     fi
   fi
@@ -1260,22 +1287,20 @@ prompt_ram() {
   "$1_prompt_segment" "$0" "$2" "yellow" "$DEFAULT_COLOR" "$(printSizeHumanReadable "$ramfree" $base)" 'RAM_ICON'
 }
+################################################################
+# Segment to display rbenv information
+# https://github.com/rbenv/rbenv#choosing-the-ruby-version
 set_default POWERLEVEL9K_RBENV_PROMPT_ALWAYS_SHOW false
-# rbenv information
 prompt_rbenv() {
-  if command which rbenv 2>/dev/null >&2; then
+  if [[ -n "$RBENV_VERSION" ]]; then
+    "$1_prompt_segment" "$0" "$2" "red" "$DEFAULT_COLOR" "$RBENV_VERSION" 'RUBY_ICON'
+  elif [ $commands[rbenv] ]; then
     local rbenv_version_name="$(rbenv version-name)"
     local rbenv_global="$(rbenv global)"
-    # Don't show anything if the current Ruby is the same as the global Ruby
-    # unless `POWERLEVEL9K_RBENV_PROMPT_ALWAYS_SHOW` is set.
-    if [[ $rbenv_version_name == $rbenv_global && "$POWERLEVEL9K_RBENV_PROMPT_ALWAYS_SHOW" = false ]]; then
-      return
-    fi
-    "$1_prompt_segment" "$0" "$2" "red" "$DEFAULT_COLOR" "$rbenv_version_name" 'RUBY_ICON'
+    if [[ "${rbenv_version_name}" != "${rbenv_global}" || "${POWERLEVEL9K_RBENV_PROMPT_ALWAYS_SHOW}" == "true" ]]; then
+      "$1_prompt_segment" "$0" "$2" "red" "$DEFAULT_COLOR" "$rbenv_version_name" 'RUBY_ICON'
+    fi
   fi
 }
 ################################################################
@@ -1419,6 +1444,7 @@ prompt_status() {
 ################################################################
 # Segment to display Swap information
 prompt_swap() {
+  local ROOT_PREFIX="${4}"
   local swap_used=0
   local base=''
@@ -1433,8 +1459,8 @@ prompt_swap() {
     base=$(echo "$raw_swap_used" | grep -o "[A-Z]*$")
   else
-    swap_total=$(grep -o -E "SwapTotal:\s+[0-9]+" /proc/meminfo | grep -o "[0-9]*")
-    swap_free=$(grep -o -E "SwapFree:\s+[0-9]+" /proc/meminfo | grep -o "[0-9]*")
+    swap_total=$(grep -o -E "SwapTotal:\s+[0-9]+" ${ROOT_PREFIX}/proc/meminfo | grep -o -E "[0-9]+")
+    swap_free=$(grep -o -E "SwapFree:\s+[0-9]+" ${ROOT_PREFIX}/proc/meminfo | grep -o -E "[0-9]+")
     swap_used=$(( swap_total - swap_free ))
     base='K'
   fi
@@ -1607,7 +1633,7 @@ set_default POWERLEVEL9K_VI_COMMAND_MODE_STRING "NORMAL"
 prompt_vi_mode() {
   case ${KEYMAP} in
     vicmd)
-      "$1_prompt_segment" "$0_NORMAL" "$2" "$DEFAULT_COLOR" "default" "$POWERLEVEL9K_VI_COMMAND_MODE_STRING"
+      "$1_prompt_segment" "$0_NORMAL" "$2" "$DEFAULT_COLOR" "white" "$POWERLEVEL9K_VI_COMMAND_MODE_STRING"
     ;;
     main|viins|*)
       if [[ -z $POWERLEVEL9K_VI_INSERT_MODE_STRING ]]; then return; fi
@@ -1628,11 +1654,22 @@ prompt_virtualenv() {
 }
 ################################################################
-# pyenv: current active python version (with restrictions)
+# Segment to display pyenv information
 # https://github.com/pyenv/pyenv#choosing-the-python-version
+set_default POWERLEVEL9K_PYENV_PROMPT_ALWAYS_SHOW false
 prompt_pyenv() {
   if [[ -n "$PYENV_VERSION" ]]; then
     "$1_prompt_segment" "$0" "$2" "blue" "$DEFAULT_COLOR" "$PYENV_VERSION" 'PYTHON_ICON'
+  elif [ $commands[pyenv] ]; then
+    local pyenv_version_name="$(pyenv version-name)"
+    local pyenv_global="system"
+    local pyenv_root="$(pyenv root)"
+    if [[ -f "${pyenv_root}/version" ]]; then
+      pyenv_global="$(pyenv version-file-read ${pyenv_root}/version)"
+    fi
+    if [[ "${pyenv_version_name}" != "${pyenv_global}" || "${POWERLEVEL9K_PYENV_PROMPT_ALWAYS_SHOW}" == "true" ]]; then
+      "$1_prompt_segment" "$0" "$2" "blue" "$DEFAULT_COLOR" "$pyenv_version_name" 'PYTHON_ICON'
+    fi
   fi
 }
@@ -1779,10 +1816,16 @@ powerlevel9k_preexec() {
 set_default POWERLEVEL9K_PROMPT_ADD_NEWLINE false
 powerlevel9k_prepare_prompts() {
-  local RETVAL RPROMPT_PREFIX RPROMPT_SUFFIX
+  # Return values. These need to be global, because
+  # they are used in prompt_status. Also, we need
+  # to get the return value of the last command at
+  # very first in this function. Do not move the
+  # lines down, otherwise the last command is not
+  # what you expected it to be.
   RETVAL=$?
   RETVALS=( "$pipestatus[@]" )
+  local RPROMPT_SUFFIX RPROMPT_PREFIX
   _P9K_COMMAND_DURATION=$((EPOCHREALTIME - _P9K_TIMER_START))
   # Reset start time
@@ -1868,7 +1911,7 @@ prompt_powerlevel9k_setup() {
     fi
   fi
-  defined POWERLEVEL9K_LEFT_PROMPT_ELEMENTS || POWERLEVEL9K_LEFT_PROMPT_ELEMENTS=(context dir rbenv vcs)
+  defined POWERLEVEL9K_LEFT_PROMPT_ELEMENTS || POWERLEVEL9K_LEFT_PROMPT_ELEMENTS=(context dir vcs)
   defined POWERLEVEL9K_RIGHT_PROMPT_ELEMENTS || POWERLEVEL9K_RIGHT_PROMPT_ELEMENTS=(status root_indicator background_jobs history time)
   # Display a warning if deprecated segments are in use.

File diff suppressed because it is too large

View File

@@ -0,0 +1 @@
powerlevel9k.zsh-theme

View File

@@ -0,0 +1,46 @@
# Contributor Covenant Code of Conduct
## Our Pledge
In the interest of fostering an open and welcoming environment, we as contributors and maintainers pledge to making participation in our project and our community a harassment-free experience for everyone, regardless of age, body size, disability, ethnicity, gender identity and expression, level of experience, nationality, personal appearance, race, religion, or sexual identity and orientation.
## Our Standards
Examples of behavior that contributes to creating a positive environment include:
* Using welcoming and inclusive language
* Being respectful of differing viewpoints and experiences
* Gracefully accepting constructive criticism
* Focusing on what is best for the community
* Showing empathy towards other community members
Examples of unacceptable behavior by participants include:
* The use of sexualized language or imagery and unwelcome sexual attention or advances
* Trolling, insulting/derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or electronic address, without explicit permission
* Other conduct which could reasonably be considered inappropriate in a professional setting
## Our Responsibilities
Project maintainers are responsible for clarifying the standards of acceptable behavior and are expected to take appropriate and fair corrective action in response to any instances of unacceptable behavior.
Project maintainers have the right and responsibility to remove, edit, or reject comments, commits, code, wiki edits, issues, and other contributions that are not aligned to this Code of Conduct, or to ban temporarily or permanently any contributor for other behaviors that they deem inappropriate, threatening, offensive, or harmful.
## Scope
This Code of Conduct applies both within project spaces and in public spaces when an individual is representing the project or its community. Examples of representing a project or community include using an official project e-mail address, posting via an official social media account, or acting as an appointed representative at an online or offline event. Representation of a project may be further defined and clarified by project maintainers.
## Enforcement
Instances of abusive, harassing, or otherwise unacceptable behavior may be reported by contacting the project team at kate.ward@forestent.com. The project team will review and investigate all complaints, and will respond in a way that it deems appropriate to the circumstances. The project team is obligated to maintain confidentiality with regard to the reporter of an incident. Further details of specific enforcement policies may be posted separately.
Project maintainers who do not follow or enforce the Code of Conduct in good faith may face temporary or permanent repercussions as determined by other members of the project's leadership.
## Attribution
This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4, available at [http://contributor-covenant.org/version/1/4][version]
[homepage]: http://contributor-covenant.org
[version]: http://contributor-covenant.org/version/1/4/

View File

@@ -0,0 +1,201 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "{}"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright {yyyy} {name of copyright owner}
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

View File

@@ -0,0 +1,434 @@
# shUnit2
shUnit2 is an [xUnit](http://en.wikipedia.org/wiki/XUnit) unit test framework for Bourne based shell scripts, and it is designed to work in a similar manner to [JUnit](http://www.junit.org), [PyUnit](http://pyunit.sourceforge.net), etc. If you have ever had the desire to write a unit test for a shell script, shUnit2 can do the job.
[![Travis CI](https://img.shields.io/travis/kward/shunit2.svg)](https://travis-ci.org/kward/shunit2)
## Table of Contents
* [Introduction](#introduction)
* [Credits / Contributors](#credits-contributors)
* [Feedback](#feedback)
* [Quickstart](#quickstart)
* [Function Reference](#function-reference)
* [General Info](#general-info)
* [Asserts](#asserts)
* [Failures](#failures)
* [Setup/Teardown](#setup-teardown)
* [Skipping](#skipping)
* [Suites](#suites)
* [Advanced Usage](#advanced-usage)
* [Some constants you can use](#some-constants-you-can-use)
* [Error Handling](#error-handling)
* [Including Line Numbers in Asserts (Macros)](#including-line-numbers-in-asserts-macros)
* [Test Skipping](#test-skipping)
* [Appendix](#appendix)
* [Getting help](#getting-help)
* [Zsh](#zsh)
---
## <a name="introduction"></a> Introduction
shUnit2 was originally developed to provide a consistent testing solution for [log4sh][log4sh], a shell based logging framework similar to [log4j](http://logging.apache.org). During the development of that product, a repeated problem of having things work just fine under one shell (`/bin/bash` on Linux to be specific), and then not working under another shell (`/bin/sh` on Solaris) kept coming up. Although several simple tests were run, they were not adequate and did not catch some corner cases. The decision was finally made to write a proper unit test framework after multiple brown-bag releases were made. _Research was done to look for an existing product that met the testing requirements, but no adequate product was found._
Tested Operating Systems (varies over time)
* Cygwin
* FreeBSD (user supported)
* Linux (Gentoo, Ubuntu)
* Mac OS X
* Solaris 8, 9, 10 (inc. OpenSolaris)
Tested Shells
* Bourne Shell (__sh__)
* BASH - GNU Bourne Again SHell (__bash__)
* DASH (__dash__)
* Korn Shell (__ksh__)
* pdksh - Public Domain Korn Shell (__pdksh__)
* zsh - Zsh (__zsh__) (since 2.1.2) _please see the Zsh shell errata for more information_
See the appropriate Release Notes for this release (`doc/RELEASE_NOTES-X.X.X.txt`) for the list of actual versions tested.
### <a name="credits-contributors"></a> Credits / Contributors
A list of contributors to shUnit2 can be found in `doc/contributors.md`. Many thanks go out to all those who have contributed to make this a better tool.
shUnit2 is the original product of many hours of work by Kate Ward, the primary author of the code. For related software, check out https://github.com/kward.
### <a name="feedback"></a> Feedback
Feedback is most certainly welcome for this document. Send your additions, comments and criticisms to the shunit2-users@googlegroups.com mailing list.
---
## <a name="quickstart"></a> Quickstart
This section will give a very quick start to running unit tests with shUnit2. More information is located in later sections.
Here is a quick sample script to show how easy it is to write a unit test in shell. _Note: the script as it stands expects that you are running it from the "examples" directory._
```sh
#! /bin/sh
# file: examples/equality_test.sh
testEquality() {
assertEquals 1 1
}
# Load shUnit2.
. ./shunit2
```
Running the unit test should give results similar to the following.
```console
$ cd examples
$ ./equality_test.sh
testEquality
Ran 1 test.
OK
```
W00t! You've just run your first successful unit test. So, what just happened? Quite a bit really, and it all happened simply by sourcing the `shunit2` library. The basic functionality for the script above goes like this:
* When shUnit2 is sourced, it will walk through any functions defined whose name starts with the string `test`, and add those to an internal list of tests to execute. Once a list of test functions to be run has been determined, shunit2 will go to work.
* Before any tests are executed, shUnit2 again looks for a function, this time one named `oneTimeSetUp()`. If it exists, it will be run. This function is normally used to set up the environment for all tests to be run. Things like creating directories for output or setting environment variables are good to place here. Just so you know, you can also declare a corresponding function named `oneTimeTearDown()` that does the same thing, but once all the tests have been completed. It is good for removing temporary directories, etc.
* shUnit2 is now ready to run tests. Before doing so though, it again looks for another function that might be declared, one named `setUp()`. If the function exists, it will be run before each test. It is good for resetting the environment so that each test starts with a clean slate. **At this stage, the first test is finally run.** The success of the test is recorded for a report that will be generated later. After the test is run, shUnit2 looks for a final function that might be declared, one named `tearDown()`. If it exists, it will be run after each test. It is a good place for cleaning up after each test, maybe doing things like removing files that were created, or removing directories. This set of steps, `setUp() > test() > tearDown()`, is repeated for all of the available tests.
* Once all the work is done, shUnit2 will generate the nice report you saw above. A summary of all the successes and failures will be given so that you know how well your code is doing.
We should now try adding a test that fails. Change your unit test to look like this.
```sh
#! /bin/sh
# file: examples/party_test.sh
testEquality() {
assertEquals 1 1
}
testPartyLikeItIs1999() {
year=`date '+%Y'`
assertEquals "It's not 1999 :-(" '1999' "${year}"
}
# Load shUnit2.
. ./shunit2
```
So, what did you get? I guess it told you that this isn't 1999. Bummer, eh? Hopefully, you noticed a couple of things that were different about the second test. First, we added an optional message that the user will see if the assert fails. Second, we did comparisons of strings instead of integers as in the first test. It doesn't matter whether you are testing for equality of strings or integers. Both work equally well with shUnit2.
Hopefully, this is enough to get you started with unit testing. If you want a ton more examples, take a look at the tests provided with [log4sh][log4sh] or [shFlags][shflags]. Both provide excellent examples of more advanced usage. shUnit2 was after all written to meet the unit testing need that [log4sh][log4sh] had.
---
## <a name="function-reference"></a> Function Reference
### <a name="general-info"></a> General Info
Any string values passed should be properly quoted -- they must be surrounded by single-quote (`'`) or double-quote (`"`) characters -- so that the shell will properly parse them.
### <a name="asserts"></a> Asserts
`assertEquals [message] expected actual`
Asserts that _expected_ and _actual_ are equal to one another. The _expected_ and _actual_ values can be either strings or integer values as both will be treated as strings. The _message_ is optional, and must be quoted.
`assertNotEquals [message] unexpected actual`
Asserts that _unexpected_ and _actual_ are not equal to one another. The _unexpected_ and _actual_ values can be either strings or integer values as both will be treated as strings. The _message_ is optional, and must be quoted.
`assertSame [message] expected actual`
This function is functionally equivalent to `assertEquals`.
`assertNotSame [message] unexpected actual`
This function is functionally equivalent to `assertNotEquals`.
`assertNull [message] value`
Asserts that _value_ is _null_, or in shell terms, a zero-length string. The _value_ must be a string as an integer value does not translate into a zero-length string. The _message_ is optional, and must be quoted.
`assertNotNull [message] value`
Asserts that _value_ is _not null_, or in shell terms, a non-empty string. The _value_ may be a string or an integer as the latter will be parsed as a non-empty string value. The _message_ is optional, and must be quoted.
`assertTrue [message] condition`
Asserts that a given shell test _condition_ is _true_. The condition can be as simple as a shell _true_ value (the value `0` -- equivalent to `${SHUNIT_TRUE}`), or a more sophisticated shell conditional expression. The _message_ is optional, and must be quoted.
A sophisticated shell conditional expression is equivalent to what the __if__ or __while__ shell built-ins would use (more specifically, what the __test__ command would use). Testing for example whether some value is greater than another value can be done this way.
`assertTrue "[ 34 -gt 23 ]"`
Testing for the ability to read a file can also be done. This particular test will fail.
`assertTrue 'test failed' "[ -r /some/non-existant/file' ]"`
As the expressions are standard shell __test__ expressions, it is possible to string multiple expressions together with `-a` and `-o` in the standard fashion. This test will succeed as the entire expression evaluates to _true_.
`assertTrue 'test failed' '[ 1 -eq 1 -a 2 -eq 2 ]'`
_One word of warning: be very careful with your quoting as shell is not the most forgiving of bad quoting, and things will fail in strange ways._
`assertFalse [message] condition`
Asserts that a given shell test _condition_ is _false_. The condition can be as simple as a shell _false_ value (the value `1` -- equivalent to `${SHUNIT_FALSE}`), or a more sophisticated shell conditional expression. The _message_ is optional, and must be quoted.
_For examples of more sophisticated expressions, see `assertTrue`._
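Since the conditions handed to `assertTrue` and `assertFalse` are ordinary shell __test__ expressions, a quick way to sanity-check a condition outside shUnit2 is to evaluate it directly. A minimal standalone sketch (no shUnit2 needed; the `check` helper is purely illustrative):

```sh
#! /bin/sh
# Evaluate candidate conditions exactly as the shell's `test` builtin would.
check() {
  if eval "$1"; then
    echo "true:  $1"
  else
    echo "false: $1"
  fi
}

check '[ 34 -gt 23 ]'                   # a true condition
check '[ -r /some/non-existent/file ]'  # false unless that file exists
check '[ 1 -eq 1 -a 2 -eq 2 ]'          # compound conditions work too
```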
### <a name="failures"></a> Failures
Just to clarify, failures __do not__ test the various arguments against one another. Failures simply fail, optionally with a message, and that is all they do. If you need to test arguments against one another, use asserts.
If all failures do is fail, why might one use them? There are times when you may have some very complicated logic that you need to test, and the simple asserts provided are simply not adequate. You can do your own validation of the code, use an `assertTrue ${SHUNIT_TRUE}` if your own tests succeeded, and use a failure to record a failure.
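The pattern described above — run your own complicated validation, then record the outcome — can be sketched as follows (the `checkConfig` helper is hypothetical, and the one-line `fail` here is a stand-in; in a real suite `fail` comes from shUnit2):

```sh
#! /bin/sh
# Stand-in for shUnit2's fail(); real tests get this from the library.
fail() { echo "FAIL: $1"; }

# Hypothetical multi-step validation too awkward for a single assert.
checkConfig() {
  [ -n "$1" ] || return 1        # must be non-empty
  [ "$1" != "bad" ] || return 1  # must not be the known-bad value
  return 0
}

if checkConfig 'good'; then
  echo "validation passed"
else
  fail "validation of 'good' should have succeeded"
fi
```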
`fail [message]`
Fails the test immediately. The _message_ is optional, and must be quoted.
`failNotEquals [message] unexpected actual`
Fails the test immediately, reporting that the _unexpected_ and _actual_ values are not equal to one another. The _message_ is optional, and must be quoted.
_Note: no actual comparison of unexpected and actual is done._
`failSame [message] expected actual`
Fails the test immediately, reporting that the _expected_ and _actual_ values are the same. The _message_ is optional, and must be quoted.
_Note: no actual comparison of expected and actual is done._
`failNotSame [message] expected actual`
Fails the test immediately, reporting that the _expected_ and _actual_ values are not the same. The _message_ is optional, and must be quoted.
_Note: no actual comparison of expected and actual is done._
### <a name="setup-teardown"></a> Setup/Teardown
`oneTimeSetUp`
This function can be optionally overridden by the user in their test suite.
If this function exists, it will be called once before any tests are run. It is useful to prepare a common environment for all tests.
`oneTimeTearDown`
This function can be optionally overridden by the user in their test suite.
If this function exists, it will be called once after all tests are completed. It is useful to clean up the environment after all tests.
`setUp`
This function can be optionally overridden by the user in their test suite.
If this function exists, it will be called before each test is run. It is useful to reset the environment before each test.
`tearDown`
This function can be optionally overridden by the user in their test suite.
If this function exists, it will be called after each test completes. It is useful to clean up the environment after each test.
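The call order of these fixture functions can be sketched with a tiny stand-alone driver (the loop below only illustrates the order shUnit2 follows; it is not the actual implementation):

```sh
#! /bin/sh
# Print the fixture call order: oneTimeSetUp, then setUp/test/tearDown
# for each test, then oneTimeTearDown.
oneTimeSetUp()    { echo "oneTimeSetUp"; }
setUp()           { echo "  setUp"; }
tearDown()        { echo "  tearDown"; }
oneTimeTearDown() { echo "oneTimeTearDown"; }

testFirst()  { echo "    testFirst"; }
testSecond() { echo "    testSecond"; }

oneTimeSetUp
for t in testFirst testSecond; do
  setUp
  "$t"
  tearDown
done
oneTimeTearDown
```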
### <a name="skipping"></a> Skipping
`startSkipping`
This function forces the remaining _assert_ and _fail_ functions to be "skipped", i.e. they will have no effect. Each function skipped will be recorded so that the total of asserts and fails will not be altered.
`endSkipping`
This function returns calls to the _assert_ and _fail_ functions to their default behavior, i.e. they will be called.
`isSkipping`
This function returns the current state of skipping. It can be compared against `${SHUNIT_TRUE}` or `${SHUNIT_FALSE}` if desired.
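Conceptually, these three functions toggle and query a single flag. A plain-shell sketch of that state machine (the function names mirror shUnit2's, but this implementation is illustrative only):

```sh
#! /bin/sh
# 0 is shell "true", 1 is shell "false" -- matching SHUNIT_TRUE/SHUNIT_FALSE.
__skipping=1

startSkipping() { __skipping=0; }
endSkipping()   { __skipping=1; }
isSkipping()    { return ${__skipping}; }

startSkipping
isSkipping && echo "asserts would now be skipped"
endSkipping
isSkipping || echo "asserts are active again"
```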
### <a name="suites"></a> Suites
The default behavior of shUnit2 is that all tests will be found dynamically. If you have a specific set of tests you want to run, or you don't want to use the standard naming scheme of prefixing your tests with `test`, these functions are for you. Most users will never use them though.
`suite`
This function can be optionally overridden by the user in their test suite.
If this function exists, it will be called when `shunit2` is sourced. If it does not exist, shUnit2 will search the parent script for all functions beginning with the word `test`, and they will be added dynamically to the test suite.
`suite_addTest name`
This function adds a function named _name_ to the list of tests scheduled for execution as part of this test suite. This function should only be called from within the `suite()` function.
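What `suite` and `suite_addTest` amount to is replacing automatic name discovery with an explicit list. A standalone sketch of that idea (the registry variable and driver loop are illustrative, not shUnit2's internals):

```sh
#! /bin/sh
# Build an explicit list of tests instead of discovering test* names.
_suite_tests=""
suite_addTest() { _suite_tests="${_suite_tests} $1"; }

testEquality() { echo "ran testEquality"; }
checkEdgeCase() { echo "ran checkEdgeCase"; }  # would NOT be auto-discovered

suite() {
  suite_addTest testEquality
  suite_addTest checkEdgeCase
}

suite
for t in ${_suite_tests}; do "$t"; done
```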
---
## <a name="advanced-usage"></a> Advanced Usage
### <a name="some-constants-you-can-use"></a> Some constants you can use
There are several constants provided by shUnit2 as variables that might be of use to you.
*Predefined*
| Constant | Value |
| --------------- | ----- |
| SHUNIT\_TRUE | Standard shell `true` value (the integer value 0). |
| SHUNIT\_FALSE | Standard shell `false` value (the integer value 1). |
| SHUNIT\_ERROR | The integer value 2. |
| SHUNIT\_TMPDIR | Path to temporary directory that will be automatically cleaned up upon exit of shUnit2. |
| SHUNIT\_VERSION | The version of shUnit2 you are running. |
*User defined*
| Constant | Value |
| ----------------- | ----- |
| SHUNIT\_CMD\_EXPR | Override which `expr` command is used. By default `expr` is used, except on BSD systems where `gexpr` is used. |
| SHUNIT\_COLOR | Enable colorized output. Options are 'auto', 'always', or 'never', with 'auto' being the default. |
| SHUNIT\_PARENT | The filename of the shell script containing the tests. This is needed specifically for Zsh support. |
| SHUNIT\_TEST\_PREFIX | Define this variable to add a prefix in front of each test name that is output in the test report. |
### <a name="error-handling"></a> Error handling
The constants values `SHUNIT_TRUE`, `SHUNIT_FALSE`, and `SHUNIT_ERROR` are returned from nearly every function to indicate the success or failure of the function. Additionally the variable `flags_error` is filled with a detailed error message if any function returns with a `SHUNIT_ERROR` value.
### <a name="including-line-numbers-in-asserts-macros"></a> Including Line Numbers in Asserts (Macros)
If you include lots of assert statements in an individual test function, it can become difficult to determine exactly which assert was thrown unless your messages are unique. To help somewhat, line numbers can be included in the assert messages. To enable this, a special shell "macro" must be used rather than the standard assert calls. _Shell doesn't actually have macros; the name is used here as the operation is similar to a standard macro._
For example, to include line numbers for a `assertEquals()` function call, replace the `assertEquals()` with `${_ASSERT_EQUALS_}`.
_**Example** -- Asserts with and without line numbers_
```sh
#! /bin/sh
# file: examples/lineno_test.sh
testLineNo() {
# This assert will have line numbers included (e.g. "ASSERT:[123] ...").
echo "ae: ${_ASSERT_EQUALS_}"
${_ASSERT_EQUALS_} 'not equal' 1 2
# This assert will not have line numbers included (e.g. "ASSERT: ...").
assertEquals 'not equal' 1 2
}
# Load shUnit2.
. ./shunit2
```
Notes:
1. Due to how shell parses command-line arguments, all strings used with macros should be quoted twice. Namely, single-quotes must be converted to single-double-quotes, and vice-versa. If the string being passed is guaranteed to be non-empty, the extra quoting is not necessary.<br/><br/>Normal `assertEquals` call.<br/>`assertEquals 'some message' 'x' ''`<br/><br/>Macro `_ASSERT_EQUALS_` call. Note the extra quoting around the _message_ and the _null_ value.<br/>`_ASSERT_EQUALS_ '"some message"' 'x' '""'`
1. Line numbers are not supported in all shells. If a shell does not support them, no errors will be thrown. Supported shells include: __bash__ (>=3.0), __ksh__, __pdksh__, and __zsh__.
### <a name="test-skipping"></a> Test Skipping
There are times where the test code you have written is just not applicable to the system you are running on. This section describes how to skip these tests but maintain the total test count.
Probably the easiest example would be shell code that is meant to run under the __bash__ shell, but the unit test is running under the Bourne shell. There are things that just won't work. The following test code demonstrates two sample functions, one that will be run under any shell, and another that will run only under the __bash__ shell.
_**Example** -- math include_
```sh
# file: examples/math.inc.
add_generic() {
num_a=$1
num_b=$2
expr $1 + $2
}
add_bash() {
num_a=$1
num_b=$2
echo $(($1 + $2))
}
```
And here is a corresponding unit test that correctly skips the `add_bash()` function when the unit test is not running under the __bash__ shell.
_**Example** -- math unit test_
```sh
#! /bin/sh
# file: examples/math_test.sh
testAdding() {
result=`add_generic 1 2`
assertEquals \
"the result of '${result}' was wrong" \
3 "${result}"
# Disable non-generic tests.
[ -z "${BASH_VERSION:-}" ] && startSkipping
result=`add_bash 1 2`
assertEquals \
"the result of '${result}' was wrong" \
3 "${result}"
}
oneTimeSetUp() {
# Load include to test.
. ./math.inc
}
# Load and run shUnit2.
. ./shunit2
```
Running the above test under the __bash__ shell will result in the following output.
```console
$ /bin/bash math_test.sh
testAdding
Ran 1 test.
OK
```
But, running the test under any other Unix shell will result in the following output.
```console
$ /bin/ksh math_test.sh
testAdding
Ran 1 test.
OK (skipped=1)
```
As you can see, the total number of tests has not changed, but the report indicates that some tests were skipped.
Skipping can be controlled with the following functions: `startSkipping()`, `endSkipping()`, and `isSkipping()`. Once skipping is enabled, it will remain enabled until the end of the current test function call, after which skipping is disabled.
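The interaction between these functions can be sketched in plain shell; `demoAssert` and the `__skipping` flag below are illustrative stand-ins, not shUnit2 internals. Asserts run normally until `startSkipping()` is called, are recorded as skips until `endSkipping()`, then run normally again.

```sh
# Minimal sketch of the skipping state machine (illustrative only).
__skipping=1  # shell-style truth: 0 = true (skipping), 1 = false

startSkipping() { __skipping=0; }
endSkipping()   { __skipping=1; }
isSkipping()    { return ${__skipping}; }

runs=0; skips=0
demoAssert() {
  if isSkipping; then skips=$((skips + 1)); else runs=$((runs + 1)); fi
}

demoAssert                 # executed
startSkipping
demoAssert                 # recorded as a skip
isSkipping && endSkipping  # turn skipping back off
demoAssert                 # executed again
echo "runs=${runs} skips=${skips}"
```

The real implementation also resets the skip state between test functions, which the sketch above does not model.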
---
## <a name="appendix"></a> Appendix
### <a name="getting-help"></a> Getting Help
For help, please send requests to either the shunit2-users@googlegroups.com mailing list (archives available on the web at http://groups.google.com/group/shunit2-users) or directly to Kate Ward <kate dot ward at forestent dot com>.
### <a name="zsh"></a> Zsh
For compatibility with Zsh, there is one requirement that must be met -- the `shwordsplit` option must be set. There are three ways to accomplish this.
1. In the unit-test script, add the following shell code snippet before sourcing the `shunit2` library.
```sh
setopt shwordsplit
```
1. When invoking __zsh__ from either the command-line or as a script with `#!`, add the `-y` parameter.
```sh
#! /bin/zsh -y
```
1. When invoking __zsh__ from the command-line, add `-o shwordsplit --` as parameters before the script name.
```console
$ zsh -o shwordsplit -- some_script
```
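The behaviour that `shwordsplit` restores can be checked with a plain word-counting function (`countWords` is an illustrative name, not part of shUnit2): POSIX shells split an unquoted expansion into separate words, while plain __zsh__ keeps it as a single word unless the option is set.

```sh
# POSIX shells split unquoted ${args} into three words. Plain zsh would
# pass it as one word unless shwordsplit is set, which is why shUnit2
# requires the option.
countWords() { echo $#; }

args='one two three'
countWords ${args}  # 3 under sh/bash; 1 under plain zsh
```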
[log4sh]: https://github.com/kward/log4sh
[shflags]: https://github.com/kward/shflags
@@ -1,14 +1,53 @@
# shUnit2 2.1.x Changes

## Changes with 2.1.8

### Enhancements

Issue #29. Add support for user defined prefix for test names. A prefix can be added by defining the `SHUNIT_TEST_PREFIX` variable.

Issue #78. Added an example for using suite tests.

Run continuous integration additionally against Ubuntu Trusty.

### Bug fixes

Issue #84. Treat syntax errors in functions as test failures.

Issue #77. Fail tests when the environment functions (e.g. `setup()` or `tearDown()`) fail.

## Changes with 2.1.7

### Bug fixes

Issue #69. shUnit2 should not exit with 0 when it has (syntax) errors.

### Enhancements

Issue #54. Shell commands prefixed with '\' so that they can be stubbed in tests.

Issue #68. Ran all code through [ShellCheck](http://www.shellcheck.net/).

Issue #60. Continuous integration tests now run with [Travis CI](https://travis-ci.org/kward/shunit2).

Issue #56. Added color support. Color is enabled automatically when supported, but can be disabled by defining the SHUNIT_COLOR environment variable before sourcing shunit2. Accepted values are `always`, `auto` (the default), and `none`.

Issue #35. Add colored output.

### Other

Moved code to GitHub (https://github.com/kward/shunit2), and restructured to be more GitHub like.

Changed to the Apache 2.0 license.

## Changes with 2.1.6

Removed all references to the DocBook documentation.

@@ -18,11 +57,11 @@ Fixed error message in fail() that stated wrong number of required arguments.
Updated lib/versions.

Fixed bug in `_shunit_mktempDir()` where a failure occurred when the 'od' command was not present in `/usr/bin`.

Renamed `shunit_tmpDir` variable to `SHUNIT_TMPDIR` to closer match the standard `TMPDIR` variable.

Added support for calling shunit2 as an executable, in addition to the existing method of sourcing it in as a library. This allows users to keep tests working

@@ -32,56 +71,55 @@ distribution.
Issue #14: Improved handling of some strange chars (e.g. single and double quotes) in messages.

Issue #27: Fixed error message for `assertSame()`.

Issue #25: Added check and error message to user when phantom functions are written to a partition mounted with `noexec`.

Issue #11: Added support for defining functions like `function someFunction()`.

## Changes with 2.1.5

Issue #1: Fixed bug pointed out by R Bernstein in the trap code where certain types of exit conditions did not generate the ending report.

Issue #2: Added `assertNotEquals()` assert.

Issue #3: Moved check for unset variables out of shUnit2 into the unit tests. Testing poorly written software blows up if this check is in, but it is only interesting for shUnit2 itself. Added `shunit_test_output.sh` unit test for this. Some shells still do not catch such errors properly (e.g. Bourne shell and BASH 2.x).

Added new custom assert in test_helpers to check for output to STDOUT, and none to STDERR.

Replaced fatal message in the temp directory creation with a `_shunit_fatal()` function call.

Fixed test_output unit test so it works now that the 'set -u' stuff was removed for Issue #3.

Flushed out the coding standards in the `README.txt` a bit more, and brought the shunit2 code up to par with the documented standards.

Issue #4: Completely changed the reporting output to be a closer match for JUnit and PyUnit. As a result, tests are counted separately from assertions.

Provide public `shunit_tmpDir` variable that can be used by unit test scripts that need automated and guaranteed cleanup.

Issue #7: Fixed duplicated printing of messages passed to asserts.

Per code review, fixed wording of `failSame()` and `failNotSame()` messages.

Replaced `version_info.sh` with versions library and made appropriate changes in other scripts to use it.

Added `gen_test_results.sh` to make releases easier.

Fixed bugs in `shlib_relToAbsPath()` in shlib.

Converted DocBook documentation to reStructuredText for easier maintenance. The DocBook documentation is now considered obsolete, and will be removed in a

@@ -97,10 +135,9 @@ When an invalid number of arguments is passed to a function, the invalid number is returned to the user so they are more aware of what the cause might be.

## Changes with 2.1.4

Removed the `_shunit_functionExists()` function as it was dead code.

Fixed zsh version number check in version_info.

@@ -123,11 +160,10 @@ result.
Improved zsh version and option checks.

Renamed the `__SHUNIT_VERSION` variable to `SHUNIT_VERSION`.

## Changes with 2.1.3

Added some explicit variable defaults, even though the variables are set, as they sometimes behave strange when the script is canceled.

@@ -141,29 +177,27 @@ considered failures, and do not affect the exit code.
Changed detection of STDERR output in unit tests.

## Changes with 2.1.2

Unset additional variables that were missed.

Added checks and workarounds to improve zsh compatibility.

Added some argument count checks `assertEquals()`, `assertNull()`, and `assertSame()`.

## Changes with 2.1.1

Fixed bug where `fail()` was not honoring skipping.

Fixed problem with `docs-docbook-prep` target that prevented it from working. (Thanks to Bryan Larsen for pointing this out.)

Changed the test in `assertFalse()` so that any non-zero value registers as false. (Credits to Bryan Larsen)

Major fiddling to bring more in line with [JUnit](http://junit.org/). Asserts give better output when no message is given, and failures now just fail.

It was pointed out that the simple 'failed' message for a failed assert was not

@@ -172,7 +206,7 @@ provide the user with an expected vs actual result. The code was revised
somewhat to bring closer into alignment with JUnit (v4.3.1 specifically) so that it feels more "normal". (Credits to Richard Jensen)

As part of the JUnit realignment, it was noticed that `fail*()` functions in JUnit don't actually do any comparisons themselves. They only generate a failure message. Updated the code to match.

@@ -193,29 +227,27 @@ Fixed the method of percent calculation for the report to achieve better accuracy.

## Changes with 2.1.0 (since 2.0.1)

This release is a branch of the 2.0.1 release.

Moving to [reStructured Text](http://docutils.sourceforge.net/rst.html) for the documentation.

Fixed problem with `fail()`. The failure message was not properly printed.

Fixed the `Makefile` so that the DocBook XML and XSLT files would be downloaded before parsing can continue.

Renamed the internal `__SHUNIT_TRUE` and `__SHUNIT_FALSE` variables to `SHUNIT_TRUE` and `SHUNIT_FALSE` so that unit tests can "use" them.

Added support for test "skipping". If skipping is turned on with the `startSkip()` function, `assert` and `fail` functions will return immediately, and the skip will be recorded.

The report output format was changed to include the percentage for each test result, rather than just those successful.

.. vim:fileencoding=latin1:ft=text:spell:tw=80