spack package
- spack.spack_version_info = (0, 20, 3)
(major, minor, micro, and optional dev release) tuple
Subpackages
- spack.bootstrap package
BootstrapEnvironment
BootstrapEnvironment.bin_dirs()
BootstrapEnvironment.environment_root()
BootstrapEnvironment.pythonpaths()
BootstrapEnvironment.spack_dev_requirements()
BootstrapEnvironment.spack_yaml()
BootstrapEnvironment.update_installations()
BootstrapEnvironment.update_syspath_and_environ()
BootstrapEnvironment.view_root()
all_core_root_specs()
ensure_bootstrap_configuration()
ensure_core_dependencies()
ensure_environment_dependencies()
ensure_patchelf_in_path_or_raise()
is_bootstrapping()
status_message()
- Submodules
- spack.bootstrap.config module
- spack.bootstrap.core module
Bootstrapper
BuildcacheBootstrapper
IS_WINDOWS
METADATA_YAML_FILENAME
SourceBootstrapper
all_core_root_specs()
bootstrapper()
bootstrapping_sources()
clingo_root_spec()
create_bootstrapper()
ensure_clingo_importable_or_raise()
ensure_core_dependencies()
ensure_executables_in_path_or_raise()
ensure_gpg_in_path_or_raise()
ensure_module_importable_or_raise()
ensure_patchelf_in_path_or_raise()
gnupg_root_spec()
patchelf_root_spec()
source_is_enabled_or_raise()
verify_patchelf()
- spack.bootstrap.environment module
BootstrapEnvironment
BootstrapEnvironment.bin_dirs()
BootstrapEnvironment.environment_root()
BootstrapEnvironment.pythonpaths()
BootstrapEnvironment.spack_dev_requirements()
BootstrapEnvironment.spack_yaml()
BootstrapEnvironment.update_installations()
BootstrapEnvironment.update_syspath_and_environ()
BootstrapEnvironment.view_root()
black_root_spec()
ensure_environment_dependencies()
flake8_root_spec()
isort_root_spec()
mypy_root_spec()
pytest_root_spec()
- spack.bootstrap.status module
- spack.build_systems package
- Submodules
- spack.build_systems.aspell_dict module
- spack.build_systems.autotools module
AutotoolsBuilder
AutotoolsBuilder.archive_files
AutotoolsBuilder.autoreconf()
AutotoolsBuilder.autoreconf_extra_args
AutotoolsBuilder.autoreconf_search_path_args
AutotoolsBuilder.build()
AutotoolsBuilder.build_directory
AutotoolsBuilder.build_system
AutotoolsBuilder.build_targets
AutotoolsBuilder.build_time_test_callbacks
AutotoolsBuilder.check()
AutotoolsBuilder.configure()
AutotoolsBuilder.configure_abs_path
AutotoolsBuilder.configure_args()
AutotoolsBuilder.configure_directory
AutotoolsBuilder.delete_configure_to_force_update()
AutotoolsBuilder.enable_or_disable()
AutotoolsBuilder.force_autoreconf
AutotoolsBuilder.install()
AutotoolsBuilder.install_libtool_archives
AutotoolsBuilder.install_targets
AutotoolsBuilder.install_time_test_callbacks
AutotoolsBuilder.installcheck()
AutotoolsBuilder.legacy_attributes
AutotoolsBuilder.legacy_methods
AutotoolsBuilder.patch_config_files
AutotoolsBuilder.patch_libtool
AutotoolsBuilder.phases
AutotoolsBuilder.remove_libtool_archives()
AutotoolsBuilder.run_after_callbacks
AutotoolsBuilder.run_before_callbacks
AutotoolsBuilder.set_configure_or_die()
AutotoolsBuilder.setup_build_environment()
AutotoolsBuilder.with_or_without()
AutotoolsPackage
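The `with_or_without()` and `enable_or_disable()` helpers above generate `./configure` flags from package variants. A plain-Python sketch of the flag-generation idea (Spack's real methods also handle multi-valued variants and activation callbacks, which this sketch omits):

```python
# Illustrative sketch only -- not Spack's implementation of
# AutotoolsBuilder.with_or_without() / enable_or_disable().

def with_or_without(name: str, active: bool) -> str:
    """Return a configure flag like --with-<name> or --without-<name>."""
    return f"--with-{name}" if active else f"--without-{name}"

def enable_or_disable(name: str, active: bool) -> str:
    """Return a configure flag like --enable-<name> or --disable-<name>."""
    return f"--enable-{name}" if active else f"--disable-{name}"

print(with_or_without("mpi", True))        # --with-mpi
print(enable_or_disable("shared", False))  # --disable-shared
```

In a real package recipe these are called on the builder with a variant name, and the flag is derived from the concretized spec rather than a boolean argument.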
- spack.build_systems.bundle module
- spack.build_systems.cached_cmake module
CachedCMakeBuilder
CachedCMakeBuilder.cache_name
CachedCMakeBuilder.cache_path
CachedCMakeBuilder.initconfig()
CachedCMakeBuilder.initconfig_compiler_entries()
CachedCMakeBuilder.initconfig_hardware_entries()
CachedCMakeBuilder.initconfig_mpi_entries()
CachedCMakeBuilder.initconfig_package_entries()
CachedCMakeBuilder.install_cmake_cache()
CachedCMakeBuilder.legacy_attributes
CachedCMakeBuilder.legacy_methods
CachedCMakeBuilder.phases
CachedCMakeBuilder.run_after_callbacks
CachedCMakeBuilder.std_cmake_args
CachedCMakeBuilder.std_initconfig_entries()
CachedCMakePackage
cmake_cache_option()
cmake_cache_path()
cmake_cache_string()
- spack.build_systems.cmake module
CMakeBuilder
CMakeBuilder.archive_files
CMakeBuilder.build()
CMakeBuilder.build_directory
CMakeBuilder.build_dirname
CMakeBuilder.build_system
CMakeBuilder.build_targets
CMakeBuilder.build_time_test_callbacks
CMakeBuilder.check()
CMakeBuilder.cmake()
CMakeBuilder.cmake_args()
CMakeBuilder.define()
CMakeBuilder.define_from_variant()
CMakeBuilder.generator
CMakeBuilder.install()
CMakeBuilder.install_targets
CMakeBuilder.legacy_attributes
CMakeBuilder.legacy_methods
CMakeBuilder.phases
CMakeBuilder.root_cmakelists_dir
CMakeBuilder.run_after_callbacks
CMakeBuilder.std_args()
CMakeBuilder.std_cmake_args
CMakePackage
generator()
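`CMakeBuilder.define()` and `define_from_variant()` turn Python values into `-D<var>:<type>=<value>` CMake cache arguments. A simplified sketch of the value-to-flag conversion (the actual method also handles path-like values and derives values from spec variants):

```python
# Simplified sketch of the conversion behind CMakeBuilder.define();
# not Spack's implementation.

def define(cmake_var: str, value) -> str:
    """Convert a Python value into a -D<var>:<type>=<value> CMake flag."""
    if isinstance(value, bool):
        kind, value = "BOOL", "ON" if value else "OFF"
    elif isinstance(value, (list, tuple)):
        # CMake list values are semicolon-separated strings
        kind, value = "STRING", ";".join(str(v) for v in value)
    else:
        kind = "STRING"
    return f"-D{cmake_var}:{kind}={value}"

print(define("BUILD_SHARED_LIBS", True))   # -DBUILD_SHARED_LIBS:BOOL=ON
print(define("CMAKE_CXX_STANDARD", 17))    # -DCMAKE_CXX_STANDARD:STRING=17
```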
- spack.build_systems.cuda module
- spack.build_systems.generic module
- spack.build_systems.gnu module
- spack.build_systems.intel module
IntelPackage
IntelPackage.auto_dispatch_options
IntelPackage.base_lib_dir
IntelPackage.blas_libs
IntelPackage.build_system_class
IntelPackage.component_bin_dir()
IntelPackage.component_include_dir()
IntelPackage.component_lib_dir()
IntelPackage.configure()
IntelPackage.configure_auto_dispatch()
IntelPackage.configure_rpath()
IntelPackage.file_to_source
IntelPackage.filter_compiler_wrappers()
IntelPackage.global_license_file
IntelPackage.headers
IntelPackage.install()
IntelPackage.intel64_int_suffix
IntelPackage.lapack_libs
IntelPackage.libs
IntelPackage.license_comment
IntelPackage.license_files
IntelPackage.license_required
IntelPackage.license_url
IntelPackage.license_vars
IntelPackage.modify_LLVMgold_rpath()
IntelPackage.mpi_compiler_wrappers
IntelPackage.mpi_setup_dependent_build_environment()
IntelPackage.normalize_path()
IntelPackage.normalize_suite_dir()
IntelPackage.openmp_libs
IntelPackage.pset_components
IntelPackage.run_after_callbacks
IntelPackage.run_before_callbacks
IntelPackage.scalapack_libs
IntelPackage.setup_dependent_build_environment()
IntelPackage.setup_dependent_package()
IntelPackage.setup_run_environment()
IntelPackage.tbb_headers
IntelPackage.tbb_libs
IntelPackage.uninstall_ism()
IntelPackage.validate_install()
IntelPackage.version_yearlike
IntelPackage.version_years
debug_print()
raise_lib_error()
- spack.build_systems.lua module
LuaBuilder
LuaBuilder.build_system
LuaBuilder.build_time_test_callbacks
LuaBuilder.generate_luarocks_config()
LuaBuilder.install()
LuaBuilder.install_time_test_callbacks
LuaBuilder.legacy_attributes
LuaBuilder.legacy_methods
LuaBuilder.luarocks_args()
LuaBuilder.phases
LuaBuilder.preprocess()
LuaBuilder.setup_build_environment()
LuaBuilder.unpack()
LuaPackage
- spack.build_systems.makefile module
MakefileBuilder
MakefileBuilder.build()
MakefileBuilder.build_directory
MakefileBuilder.build_system
MakefileBuilder.build_targets
MakefileBuilder.build_time_test_callbacks
MakefileBuilder.check()
MakefileBuilder.edit()
MakefileBuilder.install()
MakefileBuilder.install_targets
MakefileBuilder.install_time_test_callbacks
MakefileBuilder.installcheck()
MakefileBuilder.legacy_attributes
MakefileBuilder.legacy_methods
MakefileBuilder.phases
MakefileBuilder.run_after_callbacks
MakefilePackage
- spack.build_systems.maven module
- spack.build_systems.meson module
MesonBuilder
MesonBuilder.archive_files
MesonBuilder.build()
MesonBuilder.build_directory
MesonBuilder.build_dirname
MesonBuilder.build_system
MesonBuilder.build_targets
MesonBuilder.build_time_test_callbacks
MesonBuilder.check()
MesonBuilder.install()
MesonBuilder.install_targets
MesonBuilder.legacy_attributes
MesonBuilder.legacy_methods
MesonBuilder.meson()
MesonBuilder.meson_args()
MesonBuilder.phases
MesonBuilder.root_mesonlists_dir
MesonBuilder.run_after_callbacks
MesonBuilder.std_args()
MesonBuilder.std_meson_args
MesonPackage
- spack.build_systems.msbuild module
MSBuildBuilder
MSBuildBuilder.build()
MSBuildBuilder.build_directory
MSBuildBuilder.build_system
MSBuildBuilder.build_targets
MSBuildBuilder.define()
MSBuildBuilder.define_targets()
MSBuildBuilder.install()
MSBuildBuilder.install_targets
MSBuildBuilder.msbuild_args()
MSBuildBuilder.msbuild_install_args()
MSBuildBuilder.phases
MSBuildBuilder.std_msbuild_args
MSBuildBuilder.toolchain_version
MSBuildPackage
- spack.build_systems.nmake module
NMakeBuilder
NMakeBuilder.build()
NMakeBuilder.build_directory
NMakeBuilder.build_system
NMakeBuilder.build_targets
NMakeBuilder.define()
NMakeBuilder.ignore_quotes
NMakeBuilder.install()
NMakeBuilder.install_targets
NMakeBuilder.makefile_root
NMakeBuilder.nmake_args()
NMakeBuilder.nmake_install_args()
NMakeBuilder.nmakefile_name
NMakeBuilder.override_env()
NMakeBuilder.phases
NMakeBuilder.std_nmake_args
NMakePackage
- spack.build_systems.octave module
- spack.build_systems.oneapi module
IntelOneApiLibraryPackage
IntelOneApiPackage
IntelOneApiPackage.c
IntelOneApiPackage.component_dir
IntelOneApiPackage.component_prefix
IntelOneApiPackage.homepage
IntelOneApiPackage.install()
IntelOneApiPackage.install_component()
IntelOneApiPackage.redistribute_source
IntelOneApiPackage.setup_run_environment()
IntelOneApiPackage.symlink_dir()
IntelOneApiPackage.update_description()
IntelOneApiStaticLibraryList
- spack.build_systems.perl module
PerlBuilder
PerlBuilder.build()
PerlBuilder.build_system
PerlBuilder.build_time_test_callbacks
PerlBuilder.check()
PerlBuilder.configure()
PerlBuilder.configure_args()
PerlBuilder.fix_shebang()
PerlBuilder.install()
PerlBuilder.install_time_test_callbacks
PerlBuilder.legacy_attributes
PerlBuilder.legacy_methods
PerlBuilder.phases
PerlBuilder.run_after_callbacks
PerlPackage
- spack.build_systems.python module
PythonExtension
PythonPackage
PythonPackage.build_system_class
PythonPackage.get_external_python_for_prefix()
PythonPackage.headers
PythonPackage.homepage
PythonPackage.install_time_test_callbacks
PythonPackage.legacy_buildsystem
PythonPackage.libs
PythonPackage.list_url
PythonPackage.py_namespace
PythonPackage.pypi
PythonPackage.url
PythonPipBuilder
PythonPipBuilder.build_directory
PythonPipBuilder.build_system
PythonPipBuilder.build_time_test_callbacks
PythonPipBuilder.config_settings()
PythonPipBuilder.global_options()
PythonPipBuilder.install()
PythonPipBuilder.install_options()
PythonPipBuilder.install_time_test_callbacks
PythonPipBuilder.legacy_attributes
PythonPipBuilder.legacy_long_methods
PythonPipBuilder.legacy_methods
PythonPipBuilder.phases
PythonPipBuilder.run_after_callbacks
PythonPipBuilder.std_args()
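`PythonPackage.pypi` lets a recipe state only the PyPI path fragment (e.g. `"flake8/flake8-4.0.1.tar.gz"`), from which `url` and `list_url` are derived. A hedged sketch of that derivation, assuming the standard files.pythonhosted.org source layout:

```python
# Illustrative sketch of how PythonPackage.url can be derived from the
# `pypi` attribute; the real property lives in spack.build_systems.python.

def url_from_pypi(pypi: str) -> str:
    """Map 'name/name-1.0.tar.gz' to a files.pythonhosted.org source URL."""
    name = pypi.split("/")[0]
    return f"https://files.pythonhosted.org/packages/source/{name[0]}/{pypi}"

print(url_from_pypi("flake8/flake8-4.0.1.tar.gz"))
```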
- spack.build_systems.qmake module
QMakeBuilder
QMakeBuilder.build()
QMakeBuilder.build_directory
QMakeBuilder.build_system
QMakeBuilder.build_time_test_callbacks
QMakeBuilder.check()
QMakeBuilder.install()
QMakeBuilder.install_time_test_callbacks
QMakeBuilder.legacy_attributes
QMakeBuilder.legacy_methods
QMakeBuilder.phases
QMakeBuilder.qmake()
QMakeBuilder.qmake_args()
QMakeBuilder.run_after_callbacks
QMakePackage
- spack.build_systems.r module
- spack.build_systems.racket module
- spack.build_systems.rocm module
- spack.build_systems.ruby module
- spack.build_systems.scons module
SConsBuilder
SConsBuilder.build()
SConsBuilder.build_args()
SConsBuilder.build_system
SConsBuilder.build_test()
SConsBuilder.build_time_test_callbacks
SConsBuilder.install()
SConsBuilder.install_args()
SConsBuilder.install_time_test_callbacks
SConsBuilder.legacy_attributes
SConsBuilder.legacy_long_methods
SConsBuilder.legacy_methods
SConsBuilder.phases
SConsBuilder.run_after_callbacks
SConsPackage
- spack.build_systems.sip module
SIPBuilder
SIPBuilder.build()
SIPBuilder.build_args()
SIPBuilder.build_system
SIPBuilder.build_time_test_callbacks
SIPBuilder.configure()
SIPBuilder.configure_args()
SIPBuilder.configure_file()
SIPBuilder.extend_path_setup()
SIPBuilder.install()
SIPBuilder.install_args()
SIPBuilder.install_time_test_callbacks
SIPBuilder.legacy_attributes
SIPBuilder.legacy_methods
SIPBuilder.phases
SIPBuilder.run_after_callbacks
SIPPackage
- spack.build_systems.sourceforge module
- spack.build_systems.sourceware module
- spack.build_systems.waf module
WafBuilder
WafBuilder.build()
WafBuilder.build_args()
WafBuilder.build_directory
WafBuilder.build_system
WafBuilder.build_test()
WafBuilder.build_time_test_callbacks
WafBuilder.configure()
WafBuilder.configure_args()
WafBuilder.install()
WafBuilder.install_args()
WafBuilder.install_test()
WafBuilder.install_time_test_callbacks
WafBuilder.legacy_attributes
WafBuilder.legacy_methods
WafBuilder.phases
WafBuilder.python()
WafBuilder.run_after_callbacks
WafBuilder.waf()
WafPackage
- spack.build_systems.xorg module
- spack.cmd package
CommandNameError
PythonNameError
all_commands()
cmd_name()
disambiguate_spec()
disambiguate_spec_from_hashes()
display_specs()
display_specs_as_json()
ensure_single_spec_or_die()
extant_file()
filter_loaded_specs()
find_environment()
first_line()
get_command()
get_module()
gray_hash()
is_git_repo()
iter_groups()
matching_spec_from_env()
parse_specs()
print_how_many_pkgs()
python_name()
remove_options()
require_active_env()
require_cmd_name()
require_python_name()
spack_is_git_repo()
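The `python_name()` / `cmd_name()` pair above round-trips between command names (dash-separated, as typed on the CLI) and Python module/function names (underscore-separated); `require_python_name()` raises `PythonNameError` when a name is not a valid identifier. A minimal sketch of that mapping (the validation regex here is an assumption, not Spack's exact check):

```python
import re

# Sketch of the name round-tripping done by spack.cmd.python_name() and
# spack.cmd.cmd_name(); not the actual implementation.

def python_name(cmd_name: str) -> str:
    """CLI command name -> Python module/function name."""
    return cmd_name.replace("-", "_")

def cmd_name(python_name: str) -> str:
    """Python module/function name -> CLI command name."""
    return python_name.replace("_", "-")

def require_python_name(name: str) -> None:
    """Reject names that are not valid Python identifiers."""
    if not re.fullmatch(r"[A-Za-z_][A-Za-z0-9_]*", name):
        raise ValueError(f"{name!r} is not a valid Python identifier")

print(python_name("build-env"))  # build_env
print(cmd_name("unit_test"))     # unit-test
```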
- Subpackages
- Submodules
- spack.cmd.add module
- spack.cmd.arch module
- spack.cmd.audit module
- spack.cmd.blame module
- spack.cmd.bootstrap module
- spack.cmd.build_env module
- spack.cmd.buildcache module
- spack.cmd.cd module
- spack.cmd.change module
- spack.cmd.checksum module
- spack.cmd.ci module
- spack.cmd.clean module
- spack.cmd.clone module
- spack.cmd.commands module
- spack.cmd.compiler module
- spack.cmd.compilers module
- spack.cmd.concretize module
- spack.cmd.config module
- spack.cmd.containerize module
- spack.cmd.create module
AutoreconfPackageTemplate
AutotoolsPackageTemplate
BazelPackageTemplate
BuildSystemGuesser
BundlePackageTemplate
CMakePackageTemplate
IntelPackageTemplate
LuaPackageTemplate
MakefilePackageTemplate
MavenPackageTemplate
MesonPackageTemplate
OctavePackageTemplate
PackageTemplate
PerlbuildPackageTemplate
PerlmakePackageTemplate
PythonPackageTemplate
QMakePackageTemplate
RPackageTemplate
RacketPackageTemplate
RubyPackageTemplate
SIPPackageTemplate
SconsPackageTemplate
WafPackageTemplate
create()
get_build_system()
get_name()
get_repository()
get_url()
get_versions()
setup_parser()
- spack.cmd.debug module
- spack.cmd.dependencies module
- spack.cmd.dependents module
- spack.cmd.deprecate module
- spack.cmd.dev_build module
- spack.cmd.develop module
- spack.cmd.diff module
- spack.cmd.docs module
- spack.cmd.edit module
- spack.cmd.env module
ViewAction
create_temp_env_directory()
env()
env_activate()
env_activate_setup_parser()
env_create()
env_create_setup_parser()
env_deactivate()
env_deactivate_setup_parser()
env_depfile()
env_depfile_setup_parser()
env_list()
env_list_setup_parser()
env_loads()
env_loads_setup_parser()
env_remove()
env_remove_setup_parser()
env_revert()
env_revert_setup_parser()
env_status()
env_status_setup_parser()
env_update()
env_update_setup_parser()
env_view()
env_view_setup_parser()
setup_parser()
subcommand_functions
subcommands
- spack.cmd.extensions module
- spack.cmd.external module
- spack.cmd.fetch module
- spack.cmd.find module
- spack.cmd.gc module
- spack.cmd.gpg module
- spack.cmd.graph module
- spack.cmd.help module
- spack.cmd.info module
- spack.cmd.install module
- spack.cmd.license module
- spack.cmd.list module
- spack.cmd.load module
- spack.cmd.location module
- spack.cmd.log_parse module
- spack.cmd.maintainers module
- spack.cmd.make_installer module
- spack.cmd.mark module
- spack.cmd.mirror module
all_specs_with_all_versions()
concrete_specs_from_cli_or_file()
concrete_specs_from_environment()
concrete_specs_from_user()
create_mirror_for_all_specs()
create_mirror_for_all_specs_inside_environment()
create_mirror_for_individual_specs()
extend_with_additional_versions()
extend_with_dependencies()
filter_externals()
mirror()
mirror_add()
mirror_create()
mirror_destroy()
mirror_list()
mirror_remove()
mirror_set_url()
not_excluded_fn()
process_mirror_stats()
setup_parser()
specs_from_text_file()
versions_per_spec()
- spack.cmd.module module
- spack.cmd.patch module
- spack.cmd.pkg module
- spack.cmd.providers module
- spack.cmd.pydoc module
- spack.cmd.python module
- spack.cmd.reindex module
- spack.cmd.remove module
- spack.cmd.repo module
- spack.cmd.resource module
- spack.cmd.restage module
- spack.cmd.solve module
- spack.cmd.spec module
- spack.cmd.stage module
- spack.cmd.style module
changed_files()
cwd_relative()
exclude_directories
grouper()
is_package()
missing_tools()
mypy_ignores
print_style_header()
print_tool_header()
print_tool_result()
rewrite_and_print_output()
run_black()
run_flake8()
run_isort()
run_mypy()
setup_parser()
style()
tool
tool_names
tools
validate_toolset()
- spack.cmd.tags module
- spack.cmd.test module
- spack.cmd.test_env module
- spack.cmd.tutorial module
- spack.cmd.undevelop module
- spack.cmd.uninstall module
- spack.cmd.unit_test module
- spack.cmd.unload module
- spack.cmd.url module
- spack.cmd.verify module
- spack.cmd.versions module
- spack.cmd.view module
- spack.compilers package
CacheReference
CompilerDuplicateError
CompilerID
CompilerSpecInsufficientlySpecificError
DetectVersionArgs
InvalidCompilerConfigurationError
NameVariation
NoCompilerForSpecError
NoCompilersError
UnknownCompilerError
add_compilers_to_config()
all_compiler_specs()
all_compiler_types()
all_compilers()
all_compilers_config()
all_os_classes()
arguments_to_detect_version_fn()
class_for_compiler_name()
compiler_config_files()
compiler_for_spec()
compiler_from_dict()
compiler_specs_for_arch()
compilers_for_arch()
compilers_for_spec()
detect_version()
find()
find_compilers()
find_new_compilers()
find_specs_by_arch()
get_compiler_config()
get_compiler_duplicates()
get_compilers()
is_mixed_toolchain()
make_compiler_list()
pkg_spec_for_compiler()
remove_compiler_from_config()
select_new_compilers()
supported()
supported_compilers()
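Functions such as `supported()` and `class_for_compiler_name()` dispatch a compiler name to the per-compiler class (listed in the submodules below), each of which carries flag properties like `openmp_flag` and `cxx11_flag`. A toy sketch of that registry lookup, with made-up stand-in classes:

```python
# Toy registry sketch of the dispatch behind
# spack.compilers.class_for_compiler_name() / supported().
# Gcc/Clang here are illustrative stand-ins, not Spack's classes.

class Gcc:
    openmp_flag = "-fopenmp"
    cxx11_flag = "-std=c++11"

class Clang:
    openmp_flag = "-fopenmp"
    cxx11_flag = "-std=c++11"

_compilers = {"gcc": Gcc, "clang": Clang}

def supported(name: str) -> bool:
    return name in _compilers

def class_for_compiler_name(name: str):
    if not supported(name):
        raise ValueError(f"unknown compiler: {name}")
    return _compilers[name]

print(class_for_compiler_name("gcc").openmp_flag)  # -fopenmp
```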
- Submodules
- spack.compilers.aocc module
Aocc
Aocc.PrgEnv
Aocc.PrgEnv_compiler
Aocc.c11_flag
Aocc.c99_flag
Aocc.cc_names
Aocc.cc_pic_flag
Aocc.cflags
Aocc.cxx11_flag
Aocc.cxx14_flag
Aocc.cxx17_flag
Aocc.cxx_names
Aocc.cxx_pic_flag
Aocc.cxxflags
Aocc.debug_flags
Aocc.extract_version_from_output()
Aocc.f77_names
Aocc.f77_pic_flag
Aocc.f77_version()
Aocc.fc_names
Aocc.fc_pic_flag
Aocc.fc_version()
Aocc.fflags
Aocc.link_paths
Aocc.openmp_flag
Aocc.opt_flags
Aocc.required_libs
Aocc.stdcxx_libs
Aocc.verbose_flag
Aocc.version_argument
- spack.compilers.apple_clang module
- spack.compilers.arm module
Arm
Arm.c11_flag
Arm.c99_flag
Arm.cc_names
Arm.cc_pic_flag
Arm.cxx11_flag
Arm.cxx14_flag
Arm.cxx17_flag
Arm.cxx_names
Arm.cxx_pic_flag
Arm.f77_names
Arm.f77_pic_flag
Arm.f77_version()
Arm.fc_names
Arm.fc_pic_flag
Arm.fc_version()
Arm.link_paths
Arm.openmp_flag
Arm.opt_flags
Arm.required_libs
Arm.verbose_flag
Arm.version_argument
Arm.version_regex
- spack.compilers.cce module
Cce
Cce.PrgEnv
Cce.PrgEnv_compiler
Cce.c11_flag
Cce.c99_flag
Cce.cc_names
Cce.cc_pic_flag
Cce.cxx11_flag
Cce.cxx14_flag
Cce.cxx17_flag
Cce.cxx_names
Cce.cxx_pic_flag
Cce.debug_flags
Cce.f77_names
Cce.f77_pic_flag
Cce.fc_names
Cce.fc_pic_flag
Cce.is_clang_based
Cce.link_paths
Cce.openmp_flag
Cce.stdcxx_libs
Cce.suffixes
Cce.verbose_flag
Cce.version_argument
Cce.version_regex
- spack.compilers.clang module
Clang
Clang.c11_flag
Clang.c17_flag
Clang.c23_flag
Clang.c99_flag
Clang.cc_names
Clang.cc_pic_flag
Clang.cxx11_flag
Clang.cxx14_flag
Clang.cxx17_flag
Clang.cxx_names
Clang.cxx_pic_flag
Clang.debug_flags
Clang.extract_version_from_output()
Clang.f77_names
Clang.f77_pic_flag
Clang.f77_version()
Clang.fc_names
Clang.fc_pic_flag
Clang.fc_version()
Clang.link_paths
Clang.openmp_flag
Clang.opt_flags
Clang.required_libs
Clang.verbose_flag
Clang.version_argument
f77_mapping
fc_mapping
- spack.compilers.dpcpp module
- spack.compilers.fj module
Fj
Fj.c11_flag
Fj.c99_flag
Fj.cc_names
Fj.cc_pic_flag
Fj.cxx11_flag
Fj.cxx14_flag
Fj.cxx17_flag
Fj.cxx98_flag
Fj.cxx_names
Fj.cxx_pic_flag
Fj.debug_flags
Fj.f77_names
Fj.f77_pic_flag
Fj.fc_names
Fj.fc_pic_flag
Fj.link_paths
Fj.openmp_flag
Fj.opt_flags
Fj.required_libs
Fj.verbose_flag
Fj.version_argument
Fj.version_regex
- spack.compilers.gcc module
Gcc
Gcc.PrgEnv
Gcc.PrgEnv_compiler
Gcc.c11_flag
Gcc.c99_flag
Gcc.cc_names
Gcc.cc_pic_flag
Gcc.cxx11_flag
Gcc.cxx14_flag
Gcc.cxx17_flag
Gcc.cxx98_flag
Gcc.cxx_names
Gcc.cxx_pic_flag
Gcc.debug_flags
Gcc.default_version()
Gcc.f77_names
Gcc.f77_pic_flag
Gcc.f77_version()
Gcc.fc_names
Gcc.fc_pic_flag
Gcc.fc_version()
Gcc.link_paths
Gcc.openmp_flag
Gcc.opt_flags
Gcc.prefix
Gcc.required_libs
Gcc.stdcxx_libs
Gcc.suffixes
Gcc.verbose_flag
- spack.compilers.intel module
Intel
Intel.PrgEnv
Intel.PrgEnv_compiler
Intel.c11_flag
Intel.c99_flag
Intel.cc_names
Intel.cc_pic_flag
Intel.cxx11_flag
Intel.cxx14_flag
Intel.cxx_names
Intel.cxx_pic_flag
Intel.debug_flags
Intel.f77_names
Intel.f77_pic_flag
Intel.fc_names
Intel.fc_pic_flag
Intel.link_paths
Intel.openmp_flag
Intel.opt_flags
Intel.required_libs
Intel.stdcxx_libs
Intel.verbose_flag
Intel.version_argument
Intel.version_regex
- spack.compilers.msvc module
Msvc
Msvc.cc_names
Msvc.cl_version
Msvc.cxx_names
Msvc.f77_names
Msvc.f77_version()
Msvc.fc_names
Msvc.fc_version()
Msvc.ignore_version_errors
Msvc.link_paths
Msvc.msvc_version
Msvc.platform_toolset_ver
Msvc.setup_custom_environment()
Msvc.short_msvc_version
Msvc.version_argument
Msvc.version_regex
Msvc.vs_root
get_valid_fortran_pth()
- spack.compilers.nag module
Nag
Nag.cc_names
Nag.cxx11_flag
Nag.cxx_names
Nag.debug_flags
Nag.disable_new_dtags
Nag.enable_new_dtags
Nag.f77_names
Nag.f77_pic_flag
Nag.f77_rpath_arg
Nag.fc_names
Nag.fc_pic_flag
Nag.fc_rpath_arg
Nag.link_paths
Nag.linker_arg
Nag.openmp_flag
Nag.opt_flags
Nag.verbose_flag
Nag.version_argument
Nag.version_regex
- spack.compilers.nvhpc module
Nvhpc
Nvhpc.PrgEnv
Nvhpc.PrgEnv_compiler
Nvhpc.c11_flag
Nvhpc.c99_flag
Nvhpc.cc_names
Nvhpc.cc_pic_flag
Nvhpc.cxx11_flag
Nvhpc.cxx14_flag
Nvhpc.cxx17_flag
Nvhpc.cxx_names
Nvhpc.cxx_pic_flag
Nvhpc.debug_flags
Nvhpc.f77_names
Nvhpc.f77_pic_flag
Nvhpc.fc_names
Nvhpc.fc_pic_flag
Nvhpc.link_paths
Nvhpc.openmp_flag
Nvhpc.opt_flags
Nvhpc.required_libs
Nvhpc.stdcxx_libs
Nvhpc.verbose_flag
Nvhpc.version_argument
Nvhpc.version_regex
- spack.compilers.oneapi module
Oneapi
Oneapi.PrgEnv
Oneapi.PrgEnv_compiler
Oneapi.c11_flag
Oneapi.c99_flag
Oneapi.cc_names
Oneapi.cc_pic_flag
Oneapi.cxx11_flag
Oneapi.cxx14_flag
Oneapi.cxx17_flag
Oneapi.cxx20_flag
Oneapi.cxx_names
Oneapi.cxx_pic_flag
Oneapi.debug_flags
Oneapi.f77_names
Oneapi.f77_pic_flag
Oneapi.fc_names
Oneapi.fc_pic_flag
Oneapi.link_paths
Oneapi.openmp_flag
Oneapi.opt_flags
Oneapi.required_libs
Oneapi.setup_custom_environment()
Oneapi.stdcxx_libs
Oneapi.verbose_flag
Oneapi.version_argument
Oneapi.version_regex
- spack.compilers.pgi module
Pgi
Pgi.PrgEnv
Pgi.PrgEnv_compiler
Pgi.c11_flag
Pgi.c99_flag
Pgi.cc_names
Pgi.cc_pic_flag
Pgi.cxx11_flag
Pgi.cxx_names
Pgi.cxx_pic_flag
Pgi.debug_flags
Pgi.f77_names
Pgi.f77_pic_flag
Pgi.fc_names
Pgi.fc_pic_flag
Pgi.ignore_version_errors
Pgi.link_paths
Pgi.openmp_flag
Pgi.opt_flags
Pgi.required_libs
Pgi.stdcxx_libs
Pgi.verbose_flag
Pgi.version_argument
Pgi.version_regex
- spack.compilers.rocmcc module
Rocmcc
Rocmcc.PrgEnv
Rocmcc.PrgEnv_compiler
Rocmcc.c11_flag
Rocmcc.c99_flag
Rocmcc.cc_names
Rocmcc.cxx11_flag
Rocmcc.cxx14_flag
Rocmcc.cxx17_flag
Rocmcc.cxx_names
Rocmcc.extract_version_from_output()
Rocmcc.f77_names
Rocmcc.f77_version()
Rocmcc.fc_names
Rocmcc.fc_version()
Rocmcc.link_paths
Rocmcc.stdcxx_libs
- spack.compilers.xl module
Xl
Xl.c11_flag
Xl.c99_flag
Xl.cc_names
Xl.cc_pic_flag
Xl.cxx11_flag
Xl.cxx14_flag
Xl.cxx_names
Xl.cxx_pic_flag
Xl.debug_flags
Xl.f77_names
Xl.f77_pic_flag
Xl.f77_version()
Xl.fc_names
Xl.fc_pic_flag
Xl.fc_version()
Xl.fflags
Xl.link_paths
Xl.openmp_flag
Xl.opt_flags
Xl.verbose_flag
Xl.version_argument
Xl.version_regex
- spack.compilers.xl_r module
- spack.container package
- spack.detection package
DetectedPackage
by_executable()
by_library()
executable_prefix()
executables_in_path()
update_configuration()
- Submodules
- spack.detection.common module
DetectedPackage
WindowsCompilerExternalPaths
WindowsKitExternalPaths
WindowsKitExternalPaths.find_windows_driver_development_kit_paths()
WindowsKitExternalPaths.find_windows_kit_bin_paths()
WindowsKitExternalPaths.find_windows_kit_lib_paths()
WindowsKitExternalPaths.find_windows_kit_reg_installed_roots_paths()
WindowsKitExternalPaths.find_windows_kit_reg_sdk_paths()
WindowsKitExternalPaths.find_windows_kit_roots()
compute_windows_program_path_for_package()
compute_windows_user_path_for_package()
executable_prefix()
find_win32_additional_install_paths()
is_executable()
library_prefix()
path_to_dict()
update_configuration()
- spack.detection.path module
- spack.environment package
- spack.lock format
Environment
Environment.active
Environment.add()
Environment.add_default_view_to_env()
Environment.added_specs()
Environment.all_hashes()
Environment.all_matching_specs()
Environment.all_specs()
Environment.change_existing_spec()
Environment.check_views()
Environment.clear()
Environment.concrete_roots()
Environment.concretize()
Environment.concretize_and_add()
Environment.concretized_order
Environment.concretized_specs()
Environment.concretized_user_specs
Environment.config_scopes()
Environment.config_stage_dir
Environment.deconcretize()
Environment.default_view
Environment.delete_default_view()
Environment.destroy()
Environment.dev_specs
Environment.develop()
Environment.ensure_env_directory_exists()
Environment.env_file_config_scope()
Environment.env_file_config_scope_name()
Environment.env_subdir_path
Environment.get_by_hash()
Environment.get_one_by_hash()
Environment.included_config_scopes()
Environment.install_all()
Environment.install_specs()
Environment.internal
Environment.invalidate_repository_cache()
Environment.is_develop()
Environment.lock_path
Environment.log_path
Environment.manifest_path
Environment.manifest_uptodate_or_warn()
Environment.matching_spec()
Environment.name
Environment.regenerate_views()
Environment.remove()
Environment.removed_specs()
Environment.repo
Environment.repos_path
Environment.rm_default_view_from_env()
Environment.roots()
Environment.spec_lists
Environment.specs_by_hash
Environment.undevelop()
Environment.uninstalled_specs()
Environment.update_default_view()
Environment.update_environment_repository()
Environment.update_lockfile()
Environment.update_stale_references()
Environment.user_specs
Environment.view_path_default
Environment.write()
Environment.write_transaction()
SpackEnvironmentError
SpackEnvironmentViewError
activate()
active()
active_environment()
all_environment_names()
all_environments()
config_dict()
create()
create_in_dir()
deactivate()
default_manifest_yaml()
display_specs()
environment_dir_from_name()
exists()
initialize_environment_dir()
installed_specs()
is_env_dir()
is_latest_format()
manifest_file()
no_active_environment()
read()
root()
update_yaml()
- Submodules
- spack.environment.depfile module
- spack.environment.environment module
Environment
Environment.active
Environment.add()
Environment.add_default_view_to_env()
Environment.added_specs()
Environment.all_hashes()
Environment.all_matching_specs()
Environment.all_specs()
Environment.change_existing_spec()
Environment.check_views()
Environment.clear()
Environment.concrete_roots()
Environment.concretize()
Environment.concretize_and_add()
Environment.concretized_order
Environment.concretized_specs()
Environment.concretized_user_specs
Environment.config_scopes()
Environment.config_stage_dir
Environment.deconcretize()
Environment.default_view
Environment.delete_default_view()
Environment.destroy()
Environment.dev_specs
Environment.develop()
Environment.ensure_env_directory_exists()
Environment.env_file_config_scope()
Environment.env_file_config_scope_name()
Environment.env_subdir_path
Environment.get_by_hash()
Environment.get_one_by_hash()
Environment.included_config_scopes()
Environment.install_all()
Environment.install_specs()
Environment.internal
Environment.invalidate_repository_cache()
Environment.is_develop()
Environment.lock_path
Environment.log_path
Environment.manifest_path
Environment.manifest_uptodate_or_warn()
Environment.matching_spec()
Environment.name
Environment.new_installs
Environment.new_specs
Environment.regenerate_views()
Environment.remove()
Environment.removed_specs()
Environment.repo
Environment.repos_path
Environment.rm_default_view_from_env()
Environment.roots()
Environment.spec_lists
Environment.specs_by_hash
Environment.undevelop()
Environment.uninstalled_specs()
Environment.update_default_view()
Environment.update_environment_repository()
Environment.update_lockfile()
Environment.update_stale_references()
Environment.user_specs
Environment.view_path_default
Environment.views
Environment.write()
Environment.write_transaction()
EnvironmentManifestFile
EnvironmentManifestFile.absolutify_dev_paths()
EnvironmentManifestFile.add_definition()
EnvironmentManifestFile.add_develop_spec()
EnvironmentManifestFile.add_user_spec()
EnvironmentManifestFile.flush()
EnvironmentManifestFile.from_lockfile()
EnvironmentManifestFile.override_definition()
EnvironmentManifestFile.override_user_spec()
EnvironmentManifestFile.pristine_yaml_content
EnvironmentManifestFile.remove_default_view()
EnvironmentManifestFile.remove_definition()
EnvironmentManifestFile.remove_develop_spec()
EnvironmentManifestFile.remove_user_spec()
EnvironmentManifestFile.set_default_view()
EnvironmentManifestFile.yaml_content
SpackEnvironmentError
SpackEnvironmentViewError
ViewDescriptor
activate()
active()
active_environment()
all_environment_names()
all_environments()
check_disallowed_env_config_mods()
config_dict()
create()
create_in_dir()
deactivate()
deactivate_config_scope()
default_env_path
default_manifest_yaml()
display_specs()
ensure_env_root_path_exists()
env_root_path()
env_subdir_name
environment_dir_from_name()
exists()
initialize_environment_dir()
installed_specs()
is_env_dir()
is_latest_format()
lockfile_format_version
lockfile_name
make_repo_path()
manifest_file()
manifest_name
no_active_environment()
prepare_config_scope()
read()
root()
spack_env_var
update_yaml()
valid_env_name()
valid_environment_name_re
validate_env_name()
yaml_equivalent()
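`EnvironmentManifestFile` reads and rewrites the `spack.yaml` manifest that defines an environment. A minimal illustrative manifest of the kind these APIs operate on:

```yaml
# Minimal spack.yaml manifest (illustrative); the spec names and versions
# are examples, not requirements.
spack:
  specs:
    - zlib
    - cmake@3.24:
  view: true
  concretizer:
    unify: true
```

Methods such as `add_user_spec()` and `remove_user_spec()` edit the `specs` list above, and `flush()` writes the result back to disk.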
- spack.environment.shell module
- spack.hooks package
- Submodules
- spack.hooks.absolutify_elf_sonames module
- spack.hooks.licensing module
- spack.hooks.module_file_generation module
- spack.hooks.permissions_setters module
- spack.hooks.sbang module
- spack.hooks.write_install_manifest module
- spack.modules package
LmodModulefileWriter
TclModulefileWriter
disable_modules()
ensure_modules_are_enabled_or_warn()
- Submodules
- spack.modules.common module
BaseConfiguration
BaseConfiguration.context
BaseConfiguration.default_projections
BaseConfiguration.defaults
BaseConfiguration.env
BaseConfiguration.exclude_env_vars
BaseConfiguration.excluded
BaseConfiguration.hash
BaseConfiguration.literals_to_load
BaseConfiguration.projections
BaseConfiguration.specs_to_load
BaseConfiguration.specs_to_prereq
BaseConfiguration.suffixes
BaseConfiguration.template
BaseConfiguration.verbose
BaseContext
BaseContext.autoload
BaseContext.category
BaseContext.configure_options
BaseContext.context_properties
BaseContext.environment_modifications
BaseContext.has_manpath_modifications
BaseContext.long_description
BaseContext.modification_needs_formatting()
BaseContext.short_description
BaseContext.spec
BaseContext.timestamp
BaseContext.verbose
BaseFileLayout
BaseModuleFileWriter
DefaultTemplateNotDefined
ModuleIndexEntry
ModuleNotFoundError
ModulesError
ModulesTemplateNotFoundError
UpstreamModuleIndex
configuration()
dependencies()
disable_modules()
ensure_modules_are_enabled_or_warn()
generate_module_index()
get_module()
merge_config_rules()
read_module_index()
read_module_indices()
root_path()
update_dictionary_extending_lists()
- spack.modules.lmod module
- spack.modules.tcl module
- spack.operating_systems package
- spack.platforms package
Cray
Darwin
Linux
Platform
Platform.add_operating_system()
Platform.add_target()
Platform.back_end
Platform.back_os
Platform.binary_formats
Platform.default
Platform.default_os
Platform.detect()
Platform.front_end
Platform.front_os
Platform.operating_system()
Platform.priority
Platform.reserved_oss
Platform.reserved_targets
Platform.setup_platform_environment()
Platform.target()
Test
Windows
by_name()
host()
prevent_cray_detection()
reset()
- Submodules
- spack.platforms.cray module
- spack.platforms.darwin module
- spack.platforms.linux module
- spack.platforms.test module
- spack.platforms.windows module
- spack.reporters package
CDash
CDash.build_report()
CDash.build_report_for_package()
CDash.concretization_report()
CDash.extract_standalone_test_data()
CDash.finalize_report()
CDash.initialize_report()
CDash.report_build_name()
CDash.report_test_data()
CDash.success
CDash.test_report()
CDash.test_report_for_package()
CDash.test_skipped_report()
CDash.upload()
CDashConfiguration
JUnit
Reporter
- Submodules
- spack.reporters.base module
- spack.reporters.cdash module
CDash
CDash.buildIds
CDash.build_report()
CDash.build_report_for_package()
CDash.concretization_report()
CDash.extract_standalone_test_data()
CDash.finalize_report()
CDash.initialize_report()
CDash.report_build_name()
CDash.report_test_data()
CDash.success
CDash.test_report()
CDash.test_report_for_package()
CDash.test_skipped_report()
CDash.upload()
CDashConfiguration
build_stamp()
- spack.reporters.extract module
- spack.reporters.junit module
- spack.schema package
- Submodules
- spack.schema.bootstrap module
- spack.schema.buildcache_spec module
- spack.schema.cdash module
- spack.schema.ci module
- spack.schema.compilers module
- spack.schema.concretizer module
- spack.schema.config module
- spack.schema.container module
- spack.schema.cray_manifest module
- spack.schema.database_index module
- spack.schema.env module
- spack.schema.environment module
- spack.schema.gitlab_ci module
- spack.schema.merged module
- spack.schema.mirrors module
- spack.schema.modules module
- spack.schema.packages module
- spack.schema.projections module
- spack.schema.repos module
- spack.schema.spec module
- spack.schema.upstreams module
- spack.solver package
- Submodules
- spack.solver.asp module
AspFunction
AspFunctionBuilder
AspObject
DEFAULT_OUTPUT_CONFIGURATION
DeclaredVersion
ErrorHandler
InternalConcretizerError
OutputConfiguration
Provenance
PyclingoDriver
RequirementKind
RequirementRule
Result
Solver
SpackSolverSetup
SpackSolverSetup.add_concrete_versions_from_specs()
SpackSolverSetup.build_version_dict()
SpackSolverSetup.compiler_facts()
SpackSolverSetup.condition()
SpackSolverSetup.conflict_rules()
SpackSolverSetup.define_compiler_version_constraints()
SpackSolverSetup.define_concrete_input_specs()
SpackSolverSetup.define_target_constraints()
SpackSolverSetup.define_variant_values()
SpackSolverSetup.define_version_constraints()
SpackSolverSetup.define_virtual_constraints()
SpackSolverSetup.emit_facts_from_requirement_rules()
SpackSolverSetup.external_packages()
SpackSolverSetup.generate_possible_compilers()
SpackSolverSetup.impose()
SpackSolverSetup.literal_specs()
SpackSolverSetup.os_defaults()
SpackSolverSetup.package_compiler_defaults()
SpackSolverSetup.package_dependencies_rules()
SpackSolverSetup.package_provider_rules()
SpackSolverSetup.package_requirement_rules()
SpackSolverSetup.pkg_rules()
SpackSolverSetup.pkg_version_rules()
SpackSolverSetup.platform_defaults()
SpackSolverSetup.preferred_variants()
SpackSolverSetup.provider_defaults()
SpackSolverSetup.provider_requirements()
SpackSolverSetup.requirement_rules_from_package_py()
SpackSolverSetup.requirement_rules_from_packages_yaml()
SpackSolverSetup.setup()
SpackSolverSetup.spec_clauses()
SpackSolverSetup.spec_versions()
SpackSolverSetup.target_defaults()
SpackSolverSetup.target_preferences()
SpackSolverSetup.target_ranges()
SpackSolverSetup.virtual_preferences()
SpackSolverSetup.virtual_providers()
SpecBuilder
SpecBuilder.build_specs()
SpecBuilder.depends_on()
SpecBuilder.deprecated()
SpecBuilder.external_spec_selected()
SpecBuilder.hash()
SpecBuilder.ignored_attributes
SpecBuilder.no_flags()
SpecBuilder.node()
SpecBuilder.node_compiler_version()
SpecBuilder.node_flag()
SpecBuilder.node_flag_compiler_default()
SpecBuilder.node_flag_source()
SpecBuilder.node_os()
SpecBuilder.node_platform()
SpecBuilder.node_target()
SpecBuilder.reorder_flags()
SpecBuilder.sort_fn()
SpecBuilder.variant_value()
SpecBuilder.version()
UnsatisfiableSpecError
all_compilers_in_config()
ast_getter()
ast_sym()
ast_type()
bootstrap_clingo()
build_criteria_names()
build_priority_offset
check_packages_exist()
default_clingo_control()
extend_flag_list()
extract_args()
fixed_priority_offset
high_fixed_priority_offset
issequence()
listify()
packagize()
specify()
stringify()
- spack.util package
- Subpackages
- Submodules
- spack.util.classes module
- spack.util.compression module
BZipFileType
CompressedFileTypeInterface
FileTypeInterface
GZipFileType
LzmaFileType
TarFileType
ZCompressedFileType
ZipFleType
allowed_archive()
check_and_remove_ext()
check_extension()
compression_ext_from_compressed_archive()
decompressor_for()
decompressor_for_nix()
decompressor_for_win()
expand_contracted_extension()
expand_contracted_extension_in_path()
extension_from_file()
extension_from_path()
extension_from_stream()
is_bz2_supported()
is_gzip_supported()
is_lzma_supported()
reg_remove_ext()
strip_compression_extension()
strip_extension()
- spack.util.cpus module
- spack.util.crypto module
- spack.util.debug module
- spack.util.editor module
- spack.util.elf module
ELF_CONSTANTS
ELF_CONSTANTS.CLASS32
ELF_CONSTANTS.CLASS64
ELF_CONSTANTS.DATA2LSB
ELF_CONSTANTS.DATA2MSB
ELF_CONSTANTS.DT_NEEDED
ELF_CONSTANTS.DT_NULL
ELF_CONSTANTS.DT_RPATH
ELF_CONSTANTS.DT_RUNPATH
ELF_CONSTANTS.DT_SONAME
ELF_CONSTANTS.DT_STRTAB
ELF_CONSTANTS.ET_DYN
ELF_CONSTANTS.ET_EXEC
ELF_CONSTANTS.MAGIC
ELF_CONSTANTS.PT_DYNAMIC
ELF_CONSTANTS.PT_INTERP
ELF_CONSTANTS.PT_LOAD
ELF_CONSTANTS.SHT_STRTAB
ElfDynamicSectionUpdateFailed
ElfFile
ElfFile.byte_order
ElfFile.dt_needed_strs
ElfFile.dt_needed_strtab_offsets
ElfFile.dt_rpath_offset
ElfFile.dt_rpath_str
ElfFile.dt_soname_str
ElfFile.dt_soname_strtab_offset
ElfFile.elf_hdr
ElfFile.has_needed
ElfFile.has_pt_dynamic
ElfFile.has_pt_interp
ElfFile.has_rpath
ElfFile.has_soname
ElfFile.is_64_bit
ElfFile.is_little_endian
ElfFile.is_runpath
ElfFile.pt_dynamic_p_filesz
ElfFile.pt_dynamic_p_offset
ElfFile.pt_dynamic_strtab_offset
ElfFile.pt_interp_p_filesz
ElfFile.pt_interp_p_offset
ElfFile.pt_interp_str
ElfFile.pt_load
ElfFile.rpath_strtab_offset
ElfHeader
ElfParsingError
ProgramHeader32
ProgramHeader64
SectionHeader
find_strtab_size_at_offset()
get_rpaths()
parse_c_string()
parse_elf()
parse_header()
parse_program_headers()
parse_pt_dynamic()
parse_pt_interp()
read_exactly()
replace_rpath_in_place_or_raise()
retrieve_strtab()
vaddr_to_offset()
- spack.util.environment module
AppendFlagsEnv
AppendPath
DeprioritizeSystemPaths
EnvironmentModifications
EnvironmentModifications.append_flags()
EnvironmentModifications.append_path()
EnvironmentModifications.apply_modifications()
EnvironmentModifications.clear()
EnvironmentModifications.deprioritize_system_paths()
EnvironmentModifications.extend()
EnvironmentModifications.from_environment_diff()
EnvironmentModifications.from_sourcing_file()
EnvironmentModifications.group_by_name()
EnvironmentModifications.is_unset()
EnvironmentModifications.prepend_path()
EnvironmentModifications.prune_duplicate_paths()
EnvironmentModifications.remove_flags()
EnvironmentModifications.remove_path()
EnvironmentModifications.reversed()
EnvironmentModifications.set()
EnvironmentModifications.set_path()
EnvironmentModifications.shell_modifications()
EnvironmentModifications.unset()
NameModifier
NameValueModifier
PrependPath
PruneDuplicatePaths
RemoveFlagsEnv
RemovePath
SetEnv
SetPath
Trace
UnsetEnv
deprioritize_system_paths()
double_quote_escape()
dump_environment()
env_flag()
environment_after_sourcing_files()
filter_system_paths()
get_host_environment()
get_host_environment_metadata()
get_path()
inspect_path()
is_system_path()
path_put_first()
path_set()
pickle_environment()
preserve_environment()
prune_duplicate_paths()
sanitize()
set_env()
system_env_normalize()
validate()
- spack.util.executable module
- spack.util.file_cache module
- spack.util.file_permissions module
- spack.util.gcs module
- spack.util.git module
- spack.util.gpg module
- spack.util.hash module
- spack.util.ld_so_conf module
- spack.util.lock module
- spack.util.log_parse module
- spack.util.module_cmd module
- spack.util.naming module
- spack.util.package_hash module
- spack.util.parallel module
- spack.util.path module
- spack.util.pattern module
- spack.util.prefix module
- spack.util.s3 module
- spack.util.spack_json module
- spack.util.spack_yaml module
- spack.util.string module
- spack.util.timer module
- spack.util.url module
- spack.util.web module
FetchError
HTMLParseError
IncludeFragmentParser
LinkParser
NoNetworkConnectionError
SPACK_USER_AGENT
SpackWebError
base_curl_fetch_args()
check_curl_code()
fetch_url_text()
find_versions_of_archive()
get_header()
list_url()
parse_etag()
push_to_url()
read_from_url()
remove_url()
spider()
url_exists()
urlopen
- spack.util.windows_registry module
Submodules
spack.abi module
- class spack.abi.ABI[source]
Bases:
object
This class provides methods to test ABI compatibility between specs. The current implementation is rather rough and could be improved.
- architecture_compatible(target, constraint)[source]
Return true if the architecture of the target spec is ABI compatible with the architecture of the constraint spec. If either the target or constraint spec has no architecture, the target is also considered architecture ABI compatible with the constraint.
spack.audit module
Classes and functions to register audit checks for various parts of Spack and run them on-demand.
To register a new class of sanity checks (e.g. sanity checks for compilers.yaml), the first action required is to create a new AuditClass object:
audit_cfgcmp = AuditClass(
tag='CFG-COMPILER',
description='Sanity checks on compilers.yaml',
kwargs=()
)
This object is to be used as a decorator to register functions that will each perform a single check:
@audit_cfgcmp
def _search_duplicate_compilers(error_cls):
pass
These functions need to take as arguments the keywords declared when creating the decorator object, plus an error_cls argument at the end, acting as a factory to create Error objects. Each function should return a (possibly empty) list of errors.
Calls to each of these functions are triggered by the run method of the decorator object, which forwards the keyword arguments passed as input.
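The registration-and-run flow described above can be sketched with a minimal decorator/registry class. This is a simplified illustration of the pattern, not Spack's actual AuditClass implementation; in particular, the error_cls wiring here is hypothetical:

```python
class AuditClass:
    """Minimal sketch of the audit registry pattern described above."""

    def __init__(self, tag, description, kwargs=()):
        self.tag = tag
        self.description = description
        self.callbacks = []

    def __call__(self, func):
        # Using the instance as a decorator registers the check.
        self.callbacks.append(func)
        return func

    def run(self, **kwargs):
        # Forward keyword arguments to every registered check and
        # collect the (possibly empty) lists of errors they return.
        errors = []
        for check in self.callbacks:
            errors.extend(check(error_cls=RuntimeError, **kwargs))
        return errors


audit_cfgcmp = AuditClass(
    tag="CFG-COMPILER",
    description="Sanity checks on compilers.yaml",
)


@audit_cfgcmp
def _search_duplicate_compilers(error_cls):
    return []  # a check that found no errors
```

Calling `audit_cfgcmp.run()` then invokes every registered check and aggregates the error lists.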
- spack.audit.CALLBACKS = {'CFG-COMPILER': <spack.audit.AuditClass object>, 'CFG-PACKAGES': <spack.audit.AuditClass object>, 'GENERIC': <spack.audit.AuditClass object>, 'PKG-ATTRIBUTES': <spack.audit.AuditClass object>, 'PKG-DIRECTIVES': <spack.audit.AuditClass object>, 'PKG-HTTPS-DIRECTIVES': <spack.audit.AuditClass object>, 'PKG-PROPERTIES': <spack.audit.AuditClass object>}
Map an audit tag to a list of callables implementing checks
- class spack.audit.Error(summary, details)[source]
Bases:
object
Information on an error reported in a test.
- spack.audit.GROUPS = {'configs': ['CFG-COMPILER', 'CFG-PACKAGES'], 'generic': ['GENERIC'], 'packages': ['PKG-DIRECTIVES', 'PKG-ATTRIBUTES', 'PKG-PROPERTIES'], 'packages-https': ['PKG-HTTPS-DIRECTIVES']}
Map a group of checks to the list of related audit tags
- spack.audit.config_compiler = <spack.audit.AuditClass object>
Sanity checks on compilers.yaml
- spack.audit.config_packages = <spack.audit.AuditClass object>
Sanity checks on packages.yaml
- spack.audit.generic = <spack.audit.AuditClass object>
Generic checks relying on global state
- spack.audit.package_directives = <spack.audit.AuditClass object>
Sanity checks on package directives
spack.binary_distribution module
- class spack.binary_distribution.BinaryCacheIndex(cache_root)[source]
Bases:
object
The BinaryCacheIndex tracks what specs are available on (usually remote) binary caches.
This index is “best effort”, in the sense that whenever we don’t find what we’re looking for here, we will attempt to fetch it directly from configured mirrors anyway. Thus, it has the potential to speed things up, but cache misses shouldn’t break any spack functionality.
At the moment, everything in this class is initialized as lazily as possible, so that it avoids slowing anything in spack down until absolutely necessary.
TODO: What’s the cost if, e.g., we realize in the middle of a spack install that the cache is out of date, and we fetch directly? Does it mean we should have paid the price to update the cache earlier?
- clear()[source]
For testing purposes we need to be able to empty the cache and clear associated data structures.
- find_built_spec(spec, mirrors_to_check=None)[source]
Look in our cache for the built spec corresponding to spec. If the spec can be found among the configured binary mirrors, a list is returned that contains the concrete spec and the mirror url of each mirror where it can be found. Otherwise, None is returned. This method does not trigger reading anything from remote mirrors, but rather just checks if the concrete spec is found within the cache.
The cache can be updated by calling update() on the cache.
- Parameters:
spec (spack.spec.Spec) – Concrete spec to find
mirrors_to_check – Optional mapping containing mirrors to check. If None, just assumes all configured mirrors.
- Returns:
- A list of objects containing the found specs and the mirror url where each can be found, e.g.:
[ { "spec": <concrete-spec>, "mirror_url": <mirror-root-url> } ]
- find_by_hash(find_hash, mirrors_to_check=None)[source]
Same as find_built_spec but uses the hash of a spec.
- Parameters:
find_hash (str) – hash of the spec to search
mirrors_to_check – Optional mapping containing mirrors to check. If None, just assumes all configured mirrors.
- regenerate_spec_cache(clear_existing=False)[source]
Populate the local cache of concrete specs (_mirrors_for_spec) from the locally cached buildcache index files. This is essentially a no-op if it has already been done, as we keep track of the index hashes for which we have already associated the built specs.
- update(with_cooldown=False)[source]
Make sure the local cache of buildcache index files is up to date. If the same mirrors are configured as the last time this was called and none of the remote buildcache indices have changed, calling this method will only result in fetching the index hash from each mirror to confirm it is the same as what is stored locally. Otherwise, the buildcache index.json and index.json.hash files are retrieved from each configured mirror and stored locally (both in memory and on disk under _index_cache_root).
- class spack.binary_distribution.BinaryCacheQuery(all_architectures)[source]
Bases:
object
Callable object to query if a spec is in a binary cache
- class spack.binary_distribution.BuildManifestVisitor[source]
Bases:
BaseDirectoryVisitor
Visitor that collects a list of files and symlinks that can be checked for need of relocation. It knows how to dedupe hardlinks and deal with symlinks to files and directories.
- before_visit_dir(root, rel_path, depth)[source]
Return True from this function to recurse into the directory at os.path.join(root, rel_path). Return False in order not to recurse further.
- before_visit_symlinked_dir(root, rel_path, depth)[source]
Return True to recurse into the symlinked directory and False in order not to. Note: rel_path is the path to the symlink itself. Following symlinked directories blindly can cause infinite recursion due to cycles.
- visit_file(root, rel_path, depth)[source]
Handle the non-symlink file at os.path.join(root, rel_path).
- exception spack.binary_distribution.BuildcacheIndexError(message, long_message=None)[source]
Bases:
SpackError
Raised when a buildcache cannot be read for any reason
- class spack.binary_distribution.DefaultIndexFetcher(url, local_hash, urlopen=<function _urlopen.<locals>.dispatch_open>)[source]
Bases:
object
Fetcher for index.json, using separate index.json.hash as cache invalidation strategy
- class spack.binary_distribution.EtagIndexFetcher(url, etag, urlopen=<function _urlopen.<locals>.dispatch_open>)[source]
Bases:
object
Fetcher for index.json, using ETags headers as cache invalidation strategy
- exception spack.binary_distribution.FetchCacheError(errors)[source]
Bases:
Exception
Error thrown when fetching the cache failed, usually a composite error list.
- class spack.binary_distribution.FetchIndexResult(etag, hash, data, fresh)
Bases:
tuple
- data
Alias for field number 2
- etag
Alias for field number 0
- fresh
Alias for field number 3
- hash
Alias for field number 1
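The "Alias for field number" entries above follow from FetchIndexResult being a plain named tuple. A minimal reconstruction of that shape (field order inferred from the alias numbers; a sketch, not the actual class definition):

```python
from collections import namedtuple

# Reconstruction of the tuple shape documented above: etag is field 0,
# hash is field 1, data is field 2, fresh is field 3.
FetchIndexResult = namedtuple("FetchIndexResult", ["etag", "hash", "data", "fresh"])

result = FetchIndexResult(etag='"abc"', hash="deadbeef", data="{}", fresh=True)
assert result[0] == result.etag   # positional and named access agree
assert result[3] == result.fresh
```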
- exception spack.binary_distribution.ListMirrorSpecsError(message, long_message=None)[source]
Bases:
SpackError
Raised when unable to retrieve list of specs from the mirror
- exception spack.binary_distribution.NewLayoutException(msg)[source]
Bases:
SpackError
Raised if directory layout is different from buildcache.
- exception spack.binary_distribution.NoChecksumException(path, size, contents, algorithm, expected, computed)[source]
Bases:
SpackError
Raised if file fails checksum verification.
- exception spack.binary_distribution.NoGpgException(msg)[source]
Bases:
SpackError
Raised when gpg2 is not in PATH
- exception spack.binary_distribution.NoKeyException(msg)[source]
Bases:
SpackError
Raised when gpg has no default key added.
- exception spack.binary_distribution.NoOverwriteException(file_path)[source]
Bases:
SpackError
Raised when a file would be overwritten
- exception spack.binary_distribution.NoVerifyException(message, long_message=None)[source]
Bases:
SpackError
Raised if file fails signature verification.
- exception spack.binary_distribution.PickKeyException(keys)[source]
Bases:
SpackError
Raised when multiple keys can be used to sign.
- class spack.binary_distribution.PushOptions(force, relative, allow_root, regenerate_index, unsigned, key)[source]
Bases:
NamedTuple
- exception spack.binary_distribution.UnsignedPackageException(message, long_message=None)[source]
Bases:
SpackError
Raised if installation of an unsigned package is attempted without the use of --no-check-signature.
- spack.binary_distribution.binary_index: BinaryCacheIndex | Singleton = <spack.binary_distribution.BinaryCacheIndex object>
Singleton binary_index instance
- spack.binary_distribution.binary_index_location()[source]
Set up a BinaryCacheIndex for remote buildcache dbs in the user’s homedir.
- spack.binary_distribution.buildinfo_file_name(prefix)[source]
Filename of the binary package meta-data file
- spack.binary_distribution.check_specs_against_mirrors(mirrors, specs, output_file=None)[source]
Check all the given specs against buildcaches on the given mirrors and determine if any of the specs need to be rebuilt. Specs need to be rebuilt when their hash doesn’t exist in the mirror.
- Parameters:
Returns: 1 if any spec was out-of-date on any mirror, 0 otherwise.
- spack.binary_distribution.dedupe_hardlinks_if_necessary(root, buildinfo)[source]
Updates a buildinfo dict for old archives that did not dedupe hardlinks. De-duping hardlinks is necessary when relocating files in parallel and in-place. This means we must preserve inodes when relocating.
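The hardlink-deduplication idea can be illustrated with a small helper that keeps one representative path per inode, so a hardlinked file is only processed once. This is a sketch of the concept, not Spack's dedupe_hardlinks_if_necessary itself:

```python
import os

def dedupe_hardlinks(paths):
    """Keep a single representative path per (device, inode) pair,
    so a file reachable via several hardlinks is relocated only once."""
    seen = set()
    unique = []
    for path in paths:
        st = os.lstat(path)          # don't follow symlinks
        key = (st.st_dev, st.st_ino)
        if key not in seen:
            seen.add(key)
            unique.append(path)
    return unique
```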
- spack.binary_distribution.download_single_spec(concrete_spec, destination, mirror_url=None)[source]
Download the buildcache files for a single concrete spec.
- spack.binary_distribution.download_tarball(spec, unsigned=False, mirrors_for_spec=None)[source]
Download binary tarball for given package into stage area, returning path to downloaded tarball if successful, None otherwise.
- Parameters:
spec (spack.spec.Spec) – Concrete spec
unsigned (bool) – Whether or not to require signed binaries
mirrors_for_spec (list) – Optional list of concrete specs and mirrors obtained by calling binary_distribution.get_mirrors_for_spec(). These will be checked in order first before looking in other configured mirrors.
- Returns:
None if the tarball could not be downloaded (maybe also verified, depending on whether new-style signed binary packages were found). Otherwise, return an object indicating the path to the downloaded tarball, the path to the downloaded specfile (in the case of a new-style buildcache), and whether or not the tarball is already verified.
{ "tarball_path": "path-to-locally-saved-tarfile", "specfile_path": "none-or-path-to-locally-saved-specfile", "signature_verified": "true-if-binary-pkg-was-already-verified" }
- spack.binary_distribution.ensure_package_relocatable(buildinfo, binaries_dir)[source]
Check if package binaries are relocatable.
- spack.binary_distribution.extract_tarball(spec, download_result, unsigned=False, force=False)[source]
extract binary tarball for given package into install area
- spack.binary_distribution.generate_key_index(key_prefix, tmpdir=None)[source]
Create the key index page.
Creates (or replaces) the “index.json” page at the location given in key_prefix. This page contains an entry for each key (.pub) under key_prefix.
- spack.binary_distribution.generate_package_index(cache_prefix, concurrency=32)[source]
Create or replace the build cache index on the given mirror. The buildcache index contains an entry for each binary package under the cache_prefix.
- Parameters:
cache_prefix (str) – Base url of binary mirror.
concurrency (int) – The desired threading concurrency to use when fetching the spec files from the mirror.
- Returns:
None
- spack.binary_distribution.get_buildfile_manifest(spec)[source]
Return a data structure with information about a build, including text_to_relocate, binary_to_relocate, binary_to_relocate_fullpath, link_to_relocate, and other (files that fit none of the previous checks and should not be relocated). We exclude docs (man) and metadata (.spack). This can be used to find a particular kind of file in spack, or to generate the build metadata.
- spack.binary_distribution.get_buildinfo_dict(spec, rel=False)[source]
Create metadata for a tarball
- spack.binary_distribution.get_keys(install=False, trust=False, force=False, mirrors=None)[source]
Get pgp public keys available on mirror with suffix .pub
- spack.binary_distribution.get_mirrors_for_spec(spec=None, mirrors_to_check=None, index_only=False)[source]
Check if concrete spec exists on mirrors and return a list indicating the mirrors on which it can be found
- Parameters:
spec (spack.spec.Spec) – The spec to look for in binary mirrors
mirrors_to_check (dict) – Optionally override the configured mirrors with the mirrors in this dictionary.
index_only (bool) – When index_only is set to True, only the local cache is checked; no requests are made.
- Returns:
A list of objects, each containing a mirror_url and spec key, indicating all mirrors where the spec can be found.
- spack.binary_distribution.gzip_compressed_tarfile(path)[source]
Create a reproducible, compressed tarfile
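One common way to make a compressed tarball reproducible is to pin the gzip header timestamp and normalize member metadata, so the output bytes depend only on the contents. A self-contained sketch of that idea (not necessarily Spack's exact approach):

```python
import gzip
import io
import tarfile

def reproducible_tar_gz(members):
    """Build a .tar.gz whose bytes depend only on the member contents:
    the gzip mtime is pinned to 0 and member metadata is normalized."""
    buf = io.BytesIO()
    with gzip.GzipFile(fileobj=buf, mode="wb", mtime=0) as gz:
        with tarfile.open(fileobj=gz, mode="w") as tar:
            for name in sorted(members):           # deterministic order
                data = members[name]
                info = tarfile.TarInfo(name=name)
                info.size = len(data)
                info.mtime = 0                     # no wall-clock leakage
                tar.addfile(info, io.BytesIO(data))
    return buf.getvalue()
```

Building the same members twice yields byte-identical archives, which is what makes buildcache artifacts comparable by checksum.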
- spack.binary_distribution.hashes_to_prefixes(spec)[source]
Return a dictionary of hashes to prefixes for a spec and its deps, excluding externals
- spack.binary_distribution.install_root_node(spec, unsigned=False, force=False, sha256=None)[source]
Install the root node of a concrete spec from a buildcache.
Checking the sha256 sum of a node before installation is usually needed only for software installed during Spack’s bootstrapping (since we might not have a proper signature verification mechanism available).
- Parameters:
spec – spec to be installed (note that only the root node will be installed)
unsigned (bool) – if True allows installing unsigned binaries
force (bool) – force installation if the spec is already present in the local store
sha256 (str) – optional sha256 of the binary package, to be checked before installation
- spack.binary_distribution.install_single_spec(spec, unsigned=False, force=False)[source]
Install a single concrete spec from a buildcache.
- Parameters:
spec (spack.spec.Spec) – spec to be installed
unsigned (bool) – if True allows installing unsigned binaries
force (bool) – force installation if the spec is already present in the local store
- spack.binary_distribution.make_package_relative(workdir, spec, buildinfo, allow_root)[source]
Change paths in binaries to relative paths. Change absolute symlinks to relative symlinks.
- spack.binary_distribution.push(spec: Spec, mirror_url: str, options: PushOptions)[source]
Create and push binary package for a single spec to the specified mirror url.
- Parameters:
spec – Spec to package and push
mirror_url – Desired destination url for binary package
options –
- Returns:
True if package was pushed, False otherwise.
- spack.binary_distribution.push_keys(*mirrors, **kwargs)[source]
Upload pgp public keys to the given mirrors
- spack.binary_distribution.push_or_raise(spec: Spec, out_url: str, options: PushOptions)[source]
Build a tarball from given spec and put it into the directory structure used at the mirror (following <tarball_directory_name>).
This method raises NoOverwriteException when force=False and the tarball or spec.json file already exist in the buildcache.
- spack.binary_distribution.specs_to_be_packaged(specs: List[Spec], root: bool = True, dependencies: bool = True) List[Spec] [source]
Return the list of nodes to be packaged, given a list of specs.
- Parameters:
specs – list of root specs to be processed
root – include the root of each spec in the nodes
dependencies – include the dependencies of each spec in the nodes
- spack.binary_distribution.tarball_directory_name(spec)[source]
Return name of the tarball directory according to the convention <os>-<architecture>/<compiler>/<package>-<version>/
- spack.binary_distribution.tarball_name(spec, ext)[source]
Return the name of the tarfile according to the convention <os>-<architecture>-<package>-<dag_hash><ext>
- spack.binary_distribution.tarball_path_name(spec, ext)[source]
Return the full path+name for a given spec according to the convention <tarball_directory_name>/<tarball_name>
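The directory and file naming conventions above can be sketched as plain string formatting. These are hypothetical helpers that mirror the documented layout; the real functions take a spec object and derive these fields from it:

```python
def tarball_directory_name(os_name, arch, compiler, package, version):
    # Convention: <os>-<architecture>/<compiler>/<package>-<version>/
    return f"{os_name}-{arch}/{compiler}/{package}-{version}"

def tarball_name(os_name, arch, package, dag_hash, ext):
    # Convention: <os>-<architecture>-<package>-<dag_hash><ext>
    return f"{os_name}-{arch}-{package}-{dag_hash}{ext}"

def tarball_path_name(directory, name):
    # Convention: <tarball_directory_name>/<tarball_name>
    return f"{directory}/{name}"
```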
- spack.binary_distribution.try_direct_fetch(spec, mirrors=None)[source]
Try to find the spec directly on the configured mirrors
- spack.binary_distribution.try_fetch(url_to_fetch)[source]
Utility function to try and fetch a file from a url, stage it locally, and return the path to the staged file.
- Parameters:
url_to_fetch (str) – Url pointing to remote resource to fetch
- Returns:
Path to the locally staged resource, or None if it could not be fetched.
- spack.binary_distribution.try_verify(specfile_path)[source]
Utility function to attempt to verify a local file. Assumes the file is a clearsigned signature file.
- Parameters:
specfile_path (str) – Path to file to be verified.
- Returns:
True if the signature could be verified, False otherwise.
- spack.binary_distribution.update_cache_and_get_specs()[source]
Get all concrete specs for build caches available on configured mirrors. Initialization of internal cache data structures is done as lazily as possible, so this method will also attempt to initialize and update the local index cache (essentially a no-op if it has been done already and nothing has changed on the configured mirrors.)
- Throws:
FetchCacheError
spack.build_environment module
This module contains all routines related to setting up the package build environment. All of this is set up by package.py just before install() is called.
There are two parts to the build environment:
Python build environment (i.e. install() method)
This is how things are set up when install() is called. Spack takes advantage of each package being in its own module by adding a bunch of command-like functions (like configure(), make(), etc.) in the package’s module scope. This allows package writers to call them all directly in Package.install() without writing ‘self.’ everywhere. No, this isn’t Pythonic. Yes, it makes the code more readable and more like the shell script from which someone is likely porting.
Build execution environment
This is the set of environment variables, like PATH, CC, CXX, etc. that control the build. There are also a number of environment variables used to pass information (like RPATHs and other information about dependencies) to Spack’s compiler wrappers. All of these env vars are also set up here.
Skimming this module is a nice way to get acquainted with the types of calls you can make from within the install() function.
- exception spack.build_environment.ChildError(msg, module, classname, traceback_string, log_name, log_type, context)[source]
Bases:
InstallError
- Special exception class for wrapping exceptions from child processes in Spack’s build environment.
The main features of a ChildError are:
They’re serializable, so when a child build fails, we can send one of these to the parent and let the parent report what happened.
They have a traceback field containing a traceback generated on the child immediately after failure. Spack will print this on failure in lieu of trying to run sys.excepthook on the parent process, so users will see the correct stack trace from a child.
They also contain context, which shows context in the Package implementation where the error happened. This helps people debug Python code in their packages. To get it, Spack searches the stack trace for the deepest frame where self is in scope and is an instance of PackageBase. This will generally find a useful spot in the package.py file.
The long_message of a ChildError displays one of two things:
If the original error was a ProcessError, indicating a command died during the build, we’ll show context from the build log.
If the original error was any other type of error, we’ll show context from the Python code.
SpackError handles displaying the special traceback if we’re in debug mode with spack -d.
- build_errors = [('spack.util.executable', 'ProcessError')]
- property long_message
- class spack.build_environment.MakeExecutable(name, jobs, **kwargs)[source]
Bases:
Executable
Special callable executable object for make so the user can specify parallelism options on a per-invocation basis. Specifying ‘parallel’ to the call will override whatever the package’s global setting is, so you can either default to true or false and override particular calls. Specifying ‘jobs_env’ to a particular call will name an environment variable which will be set to the parallelism level (without affecting the normal invocation with -j).
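The per-invocation override behavior described above can be sketched as follows. This is a simplified stand-in that only builds the argument list and environment instead of running make; the real class wraps Executable and the exact signatures here are assumptions:

```python
class MakeExecutable:
    """Sketch of a make wrapper with per-call parallelism control."""

    def __init__(self, name, jobs):
        self.name = name
        self.jobs = jobs  # package-level default parallelism

    def build_command(self, *targets, parallel=True, jobs_env=None):
        args = [self.name]
        env = {}
        if parallel and self.jobs > 1:
            args.append(f"-j{self.jobs}")
        if jobs_env:
            # Expose the parallelism level via an environment variable,
            # without affecting the normal -j invocation.
            env[jobs_env] = str(self.jobs if parallel else 1)
        args.extend(targets)
        return args, env
```

A call-site can then pass parallel=False to force a sequential invocation regardless of the package default.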
- class spack.build_environment.ModuleChangePropagator(package)[source]
Bases:
object
Wrapper class to accept changes to a package.py Python module, and propagate them in the MRO of the package.
It is mainly used as a substitute for the package.py module when calling the “setup_dependent_package” function during build environment setup.
- exception spack.build_environment.StopPhase(message, long_message=None)[source]
Bases:
SpackError
Pickle-able exception to control stopped builds.
- spack.build_environment.determine_number_of_jobs(parallel=False, command_line=None, config_default=None, max_cpus=None)[source]
Packages that require sequential builds need 1 job. Otherwise we use the number of jobs set on the command line. If not set, then we use the config defaults (which is usually set through the builtin config scope), but we cap to the number of CPUs available to avoid oversubscription.
- Parameters:
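The precedence just described (sequential builds get one job; otherwise command line, then config default, capped at available CPUs) can be sketched as follows. This is an illustration of the documented rules, not Spack's exact code:

```python
import os

def determine_number_of_jobs(parallel=False, command_line=None,
                             config_default=None, max_cpus=None):
    """Sequential builds get 1 job; otherwise prefer the command-line
    value, then the config default capped at the available CPU count."""
    if not parallel:
        return 1
    if command_line is not None:
        return command_line
    max_cpus = max_cpus if max_cpus is not None else (os.cpu_count() or 1)
    default = config_default if config_default is not None else max_cpus
    return min(default, max_cpus)  # cap to avoid oversubscription
```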
- spack.build_environment.get_effective_jobs(jobs, parallel=True, supports_jobserver=False)[source]
Return the number of jobs, or None if supports_jobserver and a jobserver is detected.
- spack.build_environment.get_package_context(traceback, context=3)[source]
Return some context for an error message when the build fails.
- Parameters:
traceback – A traceback from some exception raised during install
context (int) – Lines of context to show before and after the line where the error happened
This function inspects the stack to find where we failed in the package file, and it adds detailed context to the long_message from there.
- spack.build_environment.get_rpath_deps(pkg)[source]
Return immediate or transitive RPATHs depending on the package.
- spack.build_environment.jobserver_enabled()[source]
Returns true if a posix jobserver (make) is detected.
- spack.build_environment.load_external_modules(pkg)[source]
Traverse a package’s spec DAG and load any external modules.
Traverse a package’s dependencies and load any external modules associated with them.
- Parameters:
pkg (spack.package_base.PackageBase) – package to load deps for
- spack.build_environment.modifications_from_dependencies(spec, context, custom_mods_only=True, set_package_py_globals=True)[source]
Returns the environment modifications that are required by the dependencies of a spec and also applies modifications to this spec’s package at module scope, if need be.
Environment modifications include:
Updating PATH so that executables can be found
Updating CMAKE_PREFIX_PATH and PKG_CONFIG_PATH so that their respective tools can find Spack-built dependencies
Running custom package environment modifications
Custom package modifications can conflict with the default PATH changes we make (specifically for the PATH, CMAKE_PREFIX_PATH, and PKG_CONFIG_PATH environment variables), so this applies changes in a fixed order:
All modifications (custom and default) from external deps first
All modifications from non-external deps afterwards
With that order, PrependPath actions from non-external default environment modifications will take precedence over custom modifications from external packages.
A secondary constraint is that custom and default modifications are grouped on a per-package basis: combined with the post-order traversal this means that default modifications of dependents can override custom modifications of dependencies (again, this would only occur for PATH, CMAKE_PREFIX_PATH, or PKG_CONFIG_PATH).
- Parameters:
spec (spack.spec.Spec) – spec for which we want the modifications
context (str) – either ‘build’ for build-time modifications or ‘run’ for run-time modifications
custom_mods_only (bool) – if True returns only custom modifications, if False returns custom and default modifications
set_package_py_globals (bool) – whether or not to set the global variables in the package.py files (this may be problematic when using buildcaches that have been built on a different but compatible OS)
- spack.build_environment.set_module_variables_for_package(pkg)[source]
Populate the Python module of a package with some useful global names. This makes things easier for package writers.
- spack.build_environment.set_wrapper_variables(pkg, env)[source]
Set environment variables used by the Spack compiler wrapper (which have the prefix SPACK_) and also add the compiler wrappers to PATH.
This determines the injected -L/-I/-rpath options; each of these specifies a search order, and this function computes them in a manner intended to match the DAG traversal order of modifications_from_dependencies: that method uses a post-order traversal so that PrependPath actions from dependencies take lower precedence, and we use a post-order traversal here to match its visitation order (visiting the lowest-priority packages first).
- spack.build_environment.setup_package(pkg, dirty, context='build')[source]
Execute all environment setup routines.
- spack.build_environment.start_build_process(pkg, function, kwargs)[source]
Create a child process to do part of a spack build.
- Parameters:
pkg (spack.package_base.PackageBase) – package whose environment we should set up the child process for.
function (Callable) – argless function to run in the child process.
Usage:
def child_fun():
    # do stuff
build_env.start_build_process(pkg, child_fun)
The child process is run with the build environment set up by spack.build_environment. This allows package authors to have full control over the environment, etc. without affecting other builds that might be executed in the same spack call.
If something goes wrong, the child process catches the error and passes it to the parent wrapped in a ChildError. The parent is expected to handle (or re-raise) the ChildError.
This uses multiprocessing.Process to create the child process. The mechanism used to create the process differs on different operating systems and for different versions of Python. In some cases “fork” is used (i.e. the “fork” system call) and some cases it starts an entirely new Python interpreter process (in the docs this is referred to as the “spawn” start method). Breaking it down by OS:
Linux always uses fork.
Mac OS uses fork before Python 3.8 and “spawn” for 3.8 and after.
Windows always uses the “spawn” start method.
For more information on multiprocessing child process creation mechanisms, see https://docs.python.org/3/library/multiprocessing.html#contexts-and-start-methods
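The parent/child error relay described above can be sketched with a pipe. This is a simplified illustration, not Spack's implementation: it assumes a POSIX platform (it forces the “fork” start method), and a plain RuntimeError stands in for Spack's ChildError.

```python
import multiprocessing as mp

def run_in_child(function):
    """Run 'function' in a child process; relay any error back to the parent."""
    ctx = mp.get_context("fork")  # assumption: POSIX platform
    parent_conn, child_conn = ctx.Pipe()

    def child():
        try:
            child_conn.send(("ok", function()))
        except Exception as exc:
            # the child catches the error and passes it to the parent
            child_conn.send(("error", repr(exc)))

    proc = ctx.Process(target=child)
    proc.start()
    status, payload = parent_conn.recv()
    proc.join()
    if status == "error":
        raise RuntimeError(payload)  # stand-in for ChildError
    return payload
```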
spack.builder module
- spack.builder.BUILDER_CLS = {'cmake': <class 'spack.build_systems.cmake.CMakeBuilder'>, 'meson': <class 'spack.build_systems.meson.MesonBuilder'>, 'python_pip': <class 'spack.build_systems.python.PythonPipBuilder'>}
Builder classes, as registered by the “builder” decorator
- class spack.builder.Builder(pkg)[source]
Bases:
Sequence
A builder is a class that, given a package object (i.e. associated with concrete spec), knows how to install it.
The builder behaves like a sequence; when iterated over, it returns the “phases” of the installation in the correct order.
- Parameters:
pkg (spack.package_base.PackageBase) – package object to be built
- archive_files: List[str] = []
List of glob expressions. Each expression must either be absolute or relative to the package source path. Matching artifacts found at the end of the build process will be copied into the same directory tree as _spack_build_logfile and _spack_build_envfile.
- property prefix
- setup_build_environment(env)[source]
Sets up the build environment for a package.
This method will be called before the current package prefix exists in Spack’s store.
- Parameters:
env (spack.util.environment.EnvironmentModifications) – environment modifications to be applied when the package is built. Package authors can call methods on it to alter the build environment.
- setup_dependent_build_environment(env, dependent_spec)[source]
Sets up the build environment of packages that depend on this one.
This is similar to setup_build_environment, but it is used to modify the build environments of packages that depend on this one. This gives packages like Python and others that follow the extension model a way to implement common environment or compile-time settings for dependencies.
This method will be called before the dependent package prefix exists in Spack’s store.
Examples
1. Installing python modules generally requires PYTHONPATH to point to the lib/pythonX.Y/site-packages directory in the module’s install prefix. This method could be used to set that variable.
- Parameters:
env (spack.util.environment.EnvironmentModifications) – environment modifications to be applied when the dependent package is built. Package authors can call methods on it to alter the build environment.
dependent_spec (spack.spec.Spec) – the spec of the dependent package about to be built. This allows the extendee (self) to query the dependent’s state. Note that this package’s spec is available as self.spec
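For illustration, an extendee like Python might prepend the dependent's site-packages directory to PYTHONPATH. In the sketch below, `EnvMods` is a hypothetical stand-in for spack.util.environment.EnvironmentModifications (whose real `prepend_path` method this mimics), and the python3.10 path component is an invented example:

```python
import os

class EnvMods:
    """Hypothetical stand-in for EnvironmentModifications."""

    def __init__(self):
        self.env = {}

    def prepend_path(self, name, path):
        old = self.env.get(name, "")
        self.env[name] = path + (os.pathsep + old if old else "")

def setup_dependent_build_environment(env, dependent_prefix):
    # point PYTHONPATH at the dependent's site-packages directory
    site_packages = os.path.join(
        dependent_prefix, "lib", "python3.10", "site-packages")
    env.prepend_path("PYTHONPATH", site_packages)
```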
- property spec
- property stage
- class spack.builder.BuilderMeta(name, bases, attr_dict)[source]
Bases:
PhaseCallbacksMeta
,ABCMeta
- class spack.builder.CallbackTemporaryStage(attribute_name, callbacks)
Bases:
tuple
An object of this kind is a shared global state used to collect callbacks during class definition time, and is flushed when the class object is created at the end of the class definition
- Parameters:
- attribute_name
Alias for field number 0
- callbacks
Alias for field number 1
- class spack.builder.InstallationPhase(name, builder)[source]
Bases:
object
Manages a single phase of the installation.
This descriptor stores at creation time the name of the method it should search for execution. The method is retrieved at __get__ time, so that it can be overridden by subclasses of whatever class declared the phases.
It also provides hooks to execute arbitrary callbacks before and after the phase.
- class spack.builder.PhaseCallbacksMeta(name, bases, attr_dict)[source]
Bases:
type
Permits registering arbitrary functions during class definition and running them later, before or after a given install phase.
Each method decorated with run_before or run_after gets temporarily stored in a global shared state when a class being defined is parsed by the Python interpreter. At class definition time that temporary storage gets flushed and a list of callbacks is attached to the class being defined.
- spack.builder.builder(build_system_name)[source]
Class decorator used to register the default builder for a given build-system.
- Parameters:
build_system_name (str) – name of the build-system
- spack.builder.buildsystem_name(pkg)[source]
Given a package object with an associated concrete spec, return the name of its build system.
- Parameters:
pkg (spack.package_base.PackageBase) – package for which we want the build system name
- spack.builder.create(pkg)[source]
Given a package object with an associated concrete spec, return the builder object that can install it.
- Parameters:
pkg (spack.package_base.PackageBase) – package for which we want the builder
- spack.builder.run_after(phase, when=None)
Decorator to register a function for running after a given phase.
spack.caches module
Caches used by Spack to store data
- class spack.caches.MirrorCache(root, skip_unstable_versions)[source]
Bases:
object
- spack.caches.fetch_cache: FsCache | Singleton = <spack.fetch_strategy.FsCache object>
Spack’s local cache for downloaded source archives
- spack.caches.fetch_cache_location()[source]
Filesystem cache of downloaded archives.
This prevents Spack from repeatedly fetching the same files when building the same package different ways or multiple times.
spack.ci module
- class spack.ci.CDashHandler(ci_cdash)[source]
Bases:
object
Class for managing CDash data and processing.
- property build_name
Returns the CDash build name.
A name will be generated if the current_spec property is set; otherwise, the value will be retrieved from the environment through the SPACK_CDASH_BUILD_NAME variable.
Returns: (str) current spec’s CDash build name.
- property build_stamp
Returns the CDash build stamp.
The one defined by SPACK_CDASH_BUILD_STAMP environment variable is preferred due to the representation of timestamps; otherwise, one will be built.
Returns: (str) current CDash build stamp
- property project_enc
- report_skipped(spec: Spec, report_dir: str, reason: str | None)[source]
Explicitly report skipping testing of a spec (e.g., its CI configuration identifies it as known to have broken tests, or the CI installation failed).
- Parameters:
spec – spec being tested
report_dir – directory where the report will be written
reason – reason the test is being skipped
- property upload_url
- class spack.ci.PushResult(success, url)
Bases:
tuple
- success
Alias for field number 0
- url
Alias for field number 1
- class spack.ci.SpackCI(ci_config, phases, staged_phases)[source]
Bases:
object
Spack CI object used to generate the intermediate representation used by the CI generator(s).
- spack.ci.can_sign_binaries()[source]
Utility method to determine if this spack instance is capable of signing binary packages. This is currently only possible if the spack gpg keystore contains exactly one secret key.
- spack.ci.can_verify_binaries()[source]
Utility method to determine if this spack instance is capable (at least in theory) of verifying signed binaries.
- spack.ci.compute_affected_packages(rev1='HEAD^', rev2='HEAD')[source]
Determine which packages were added, removed or changed between rev1 and rev2, and return the names as a set
- spack.ci.configure_compilers(compiler_action, scope=None)[source]
- Depending on the compiler_action parameter, either turn on the
install_missing_compilers config option, or find spack compilers, or do nothing. This is used from rebuild jobs in bootstrapping pipelines, where in the bootstrapping phase we would pass FIND_ANY in case of compiler-agnostic bootstrapping, while in the spec building phase we would pass INSTALL_MISSING in order to get spack to use the compiler which was built in the previous phase and is now sitting in the binary mirror.
- Parameters:
compiler_action (str) – ‘FIND_ANY’, ‘INSTALL_MISSING’ have meanings described above. Any other value essentially results in a no-op.
scope (spack.config.ConfigScope) – Optional. The scope in which to look for compilers, in case ‘FIND_ANY’ was provided.
- spack.ci.copy_files_to_artifacts(src, artifacts_dir)[source]
Copy file(s) to the given artifacts directory
- spack.ci.copy_stage_logs_to_artifacts(job_spec: Spec, job_log_dir: str) None [source]
Copy selected build stage file(s) to the given artifacts directory
Looks for build logs in the stage directory of the given job_spec, and attempts to copy the files into the directory given by job_log_dir.
- Parameters:
job_spec – spec associated with spack install log
job_log_dir – path into which build log should be copied
- spack.ci.copy_test_logs_to_artifacts(test_stage, job_test_dir)[source]
Copy test log file(s) to the given artifacts directory
- spack.ci.create_buildcache(input_spec: Spec, *, pr_pipeline: bool, pipeline_mirror_url: str | None = None, buildcache_mirror_url: str | None = None) List[PushResult] [source]
Create the buildcache at the provided mirror(s).
- Parameters:
input_spec – Installed spec to package and push
buildcache_mirror_url – URL for the buildcache mirror
pipeline_mirror_url – URL for the pipeline mirror
pr_pipeline – True if the CI job is for a PR
Returns: A list of PushResults, indicating success or failure.
- spack.ci.display_broken_spec_messages(base_url, hashes)[source]
Fetch the broken spec file for each of the hashes under the base_url and print a message with some details about each one.
- spack.ci.download_and_extract_artifacts(url, work_dir)[source]
- Look for gitlab artifacts.zip at the given url, and attempt to download
and extract the contents into the given work_dir
- spack.ci.generate_gitlab_ci_yaml(env, print_summary, output_file, prune_dag=False, check_index_only=False, run_optimizer=False, use_dependencies=False, artifacts_root=None, remote_mirror_override=None)[source]
- Generate a gitlab yaml file to run a dynamic child pipeline from
the spec matrix in the active environment.
- Parameters:
env (spack.environment.Environment) – Activated environment object which must contain a gitlab-ci section describing how to map specs to runners
print_summary (bool) – Should we print a summary of all the jobs in the stages in which they were placed.
output_file (str) – File path where generated file should be written
prune_dag (bool) – If True, do not generate jobs for specs that are already built on the mirror.
check_index_only (bool) – If True, attempt to fetch the mirror index and only use that to determine whether built specs on the mirror are up to date (this mode results in faster yaml generation time). Otherwise, also check each spec directly by url (useful if there is no index or it might be out of date).
run_optimizer (bool) – If True, post-process the generated yaml to try to reduce the size (attempts to collect repeated configuration and replace it with definitions).
use_dependencies (bool) – If true, use “dependencies” rather than “needs” (“needs” allows DAG scheduling). Useful if gitlab instance cannot be configured to handle more than a few “needs” per job.
artifacts_root (str) – Path where artifacts like logs, environment files (spack.yaml, spack.lock), etc should be written. GitLab requires this to be within the project directory.
remote_mirror_override (str) – Typically only needed when one spack.yaml is used to populate several mirrors with binaries, based on some criteria. Spack protected pipelines populate different mirrors based on branch name, facilitated by this option.
- spack.ci.get_change_revisions()[source]
If this is a git repo get the revisions to use when checking for changed packages and spack core modules.
- spack.ci.get_job_name(phase, strip_compiler, spec, osarch, build_group)[source]
Given the necessary parts, format the gitlab job name
- Parameters:
phase (str) – Either ‘specs’ for the main phase, or the name of a bootstrapping phase
strip_compiler (bool) – Should compiler be stripped from job name
spec (spack.spec.Spec) – Spec job will build
osarch – Architecture TODO: (this is a spack.spec.ArchSpec, but sphinx doesn’t recognize the type and fails).
build_group (str) – Name of build group this job belongs to (a CDash notion)
Returns: The job name
- spack.ci.get_spack_info()[source]
If spack is running from a git repo, return the most recent git log entry, otherwise, return a string containing the spack version.
- spack.ci.get_spec_filter_list(env, affected_pkgs, dependent_traverse_depth=None)[source]
- Given a list of package names and an active/concretized
environment, return the set of all concrete specs from the environment that could have been affected by changing the list of packages.
If a dependent_traverse_depth is given, it is used to limit upward (in the parent direction) traversal of specs of touched packages. E.g. if 1 is provided, then only direct dependents of touched package specs are traversed to produce specs that could have been affected by changing the package, while if 0 is provided, only the changed specs themselves are traversed. If None is given, upward traversal of touched package specs is done all the way to the environment roots. Providing a negative number results in no traversals at all, yielding an empty set.
- Parameters:
env (spack.environment.Environment) – Active concrete environment
affected_pkgs (List[str]) – Affected package names
dependent_traverse_depth – Optional integer to limit dependent traversal, or None to disable the limit.
- Returns:
A set of concrete specs from the active environment including those associated with affected packages, their dependencies and dependents, as well as their dependents’ dependencies.
- spack.ci.get_stack_changed(env_path, rev1='HEAD^', rev2='HEAD')[source]
Given an environment manifest path and two revisions to compare, return whether or not the stack was changed. Returns True if the environment manifest changed between the provided revisions (or additionally if the .gitlab-ci.yml file itself changed). Returns False otherwise.
- spack.ci.import_signing_key(base64_signing_key)[source]
- Given Base64-encoded gpg key, decode and import it to use for
signing packages.
- Parameters:
base64_signing_key (str) – A gpg key including the secret key, armor-exported and base64 encoded, so it can be stored in a gitlab CI variable. For an example of how to generate such a key, see: https://github.com/spack/spack-infrastructure/blob/main/gitlab-docker/files/gen-key
- spack.ci.process_command(name, commands, repro_dir)[source]
Create a script for and run the command. Copy the script to the reproducibility directory.
- Parameters:
Returns: the exit code from processing the command
- spack.ci.push_mirror_contents(input_spec: Spec, mirror_url, sign_binaries)[source]
Push one or more binary packages to the mirror.
- Parameters:
input_spec (spack.spec.Spec) – Installed spec to push
mirror_url (str) – Base url of target mirror
sign_binaries (bool) – If True, spack will attempt to sign binary package before pushing.
- spack.ci.read_broken_spec(broken_spec_url)[source]
Read data from broken specs file located at the url, return as a yaml object.
- spack.ci.remove_other_mirrors(mirrors_to_keep, scope=None)[source]
Remove all mirrors from the given config scope, the exceptions being any listed in mirrors_to_keep, which is a list of mirror urls.
- spack.ci.reproduce_ci_job(url, work_dir)[source]
Given a url to gitlab artifacts.zip from a failed ‘spack ci rebuild’ job, attempt to set up an environment in which the failure can be reproduced locally. This entails the following:
First download and extract artifacts. Then look through those artifacts to glean some information needed for the reproducer (e.g. one of the artifacts contains information about the version of spack tested by gitlab, another is the generated pipeline yaml containing details of the job like the docker image used to run it). The output of this function is a set of printed instructions for running docker and then commands to run to reproduce the build once inside the container.
- spack.ci.run_standalone_tests(**kwargs)[source]
Run stand-alone tests on the current spec.
- Parameters:
kwargs (dict) – dictionary of arguments used to run the tests
List of recognized keys:
“cdash” (CDashHandler): (optional) cdash handler instance
“fail_fast” (bool): (optional) terminate tests after the first failure
“log_file” (str): (optional) test log file name if NOT CDash reporting
“job_spec” (Spec): spec that was built
“repro_dir” (str): reproduction directory
- spack.ci.setup_spack_repro_version(repro_dir, checkout_commit, merge_commit=None)[source]
- Look in the local spack clone to find the checkout_commit, and if
provided, the merge_commit given as arguments. If those commits can be found locally, then clone spack and attempt to recreate a merge commit with the same parent commits as tested in gitlab. This looks something like 1) git clone repo && cd repo 2) git checkout <checkout_commit> 3) git merge <merge_commit>. If there is no merge_commit provided, then skip step (3).
- Parameters:
- Returns: True if git repo state was successfully recreated, or False
otherwise.
- spack.ci.stage_spec_jobs(specs, check_index_only=False, mirrors_to_check=None)[source]
- Take a set of release specs and generate a list of “stages”, where the
jobs in any stage are dependent only on jobs in previous stages. This allows us to maximize build parallelism within the gitlab-ci framework.
- Parameters:
specs (Iterable) – Specs to build
check_index_only (bool) – Regardless of whether DAG pruning is enabled, all configured mirrors are searched to see if binaries for specs are up to date on those mirrors. This flag limits that search to the binary cache indices on those mirrors to speed the process up, even though there is no guarantee the index is up to date.
mirrors_to_check – Optional mapping giving mirrors to check instead of any configured mirrors.
- Returns: A tuple of information objects describing the specs, dependencies
and stages:
- spec_labels: A dictionary mapping the spec labels which are made of
(pkg-name/hash-prefix), to objects containing “spec” and “needs_rebuild” keys. The root spec is the spec of which this spec is a dependency and the spec is the formatted spec string for this spec.
- deps: A dictionary where the keys should also have appeared as keys in
the spec_labels dictionary, and the values are the set of dependencies for that spec.
- stages: An ordered list of sets, each of which contains all the jobs to
built in that stage. The jobs are expressed in the same format as the keys in the spec_labels and deps objects.
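The staging step is essentially a topological layering of the dependency graph. A sketch, under the assumption that deps maps each job label to the set of labels it depends on:

```python
def stage_jobs(deps):
    """Split jobs into stages so each stage depends only on earlier stages."""
    remaining = {job: set(d) for job, d in deps.items()}
    stages = []
    while remaining:
        # jobs whose dependencies have all been staged already
        ready = {job for job, d in remaining.items() if not d}
        if not ready:
            raise ValueError("dependency cycle detected")
        stages.append(ready)
        for job in ready:
            del remaining[job]
        for d in remaining.values():
            d -= ready  # staged jobs no longer block anything
    return stages
```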
spack.ci_needs_workaround module
- spack.ci_needs_workaround.get_job_name(needs_entry)
spack.ci_optimization module
- spack.ci_optimization.add_extends(yaml, key)[source]
Modifies the given object “yaml” so that it includes an “extends” key whose value features “key”.
If “extends” is not in yaml, then yaml is modified such that yaml[“extends”] == key.
If yaml[“extends”] is a str, then yaml is modified such that yaml[“extends”] == [yaml[“extends”], key]
If yaml[“extends”] is a list that does not include key, then key is appended to the list.
Otherwise, yaml is left unchanged.
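The cases above translate directly into a sketch (not necessarily the exact implementation):

```python
def add_extends(yaml, key):
    """Ensure yaml["extends"] features 'key', per the cases described above."""
    extends = yaml.get("extends")
    if extends is None:
        yaml["extends"] = key
    elif isinstance(extends, str):
        yaml["extends"] = [extends, key]
    elif key not in extends:
        extends.append(key)
    # otherwise: key is already present, leave yaml unchanged
```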
- spack.ci_optimization.build_histogram(iterator, key)[source]
Builds a histogram of values given an iterable of mappings and a key.
For each mapping “m” with key “key” in iterator, the value m[key] is considered.
Returns a list of tuples (hash, count, proportion, value), where
“hash” is a sha1sum hash of the value.
“count” is the number of occurrences of values that hash to “hash”.
“proportion” is the proportion of all values considered above that hash to “hash”.
“value” is one of the values considered above that hash to “hash”. Which value is chosen when multiple values hash to the same “hash” is undefined.
The list is sorted in descending order by count, yielding the most frequently occurring hashes first.
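A sketch of the histogram construction, assuming values are hashed via their JSON serialization (the real code may serialize differently):

```python
import hashlib
import json
from collections import Counter

def build_histogram(iterator, key):
    """Return (hash, count, proportion, value) tuples, most frequent first."""
    counts = Counter()
    examples = {}
    for mapping in iterator:
        if key not in mapping:
            continue  # mappings without the key are not considered
        value = mapping[key]
        digest = hashlib.sha1(
            json.dumps(value, sort_keys=True).encode()).hexdigest()
        counts[digest] += 1
        examples[digest] = value  # one representative value per hash
    total = sum(counts.values())
    return [(h, c, c / total, examples[h]) for h, c in counts.most_common()]
```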
- spack.ci_optimization.common_subobject(yaml, sub)[source]
Factor prototype object “sub” out of the values of mapping “yaml”.
Consider a modified copy of yaml, “new”, where for each key, “key” in yaml:
If yaml[key] matches sub, then new[key] = subkeys(yaml[key], sub).
Otherwise, new[key] = yaml[key].
If the above match criterion is not satisfied for any such key, then (yaml, None) is returned and the yaml object is left unchanged.
Otherwise, each matching value in new is modified as in add_extends(new[key], common_key), and then new[common_key] is set to sub. The common_key value is chosen such that it does not match any preexisting key in new. In this case, (new, common_key) is returned.
- spack.ci_optimization.matches(obj, proto)[source]
Returns True if the test object “obj” matches the prototype object “proto”.
If obj and proto are mappings, obj matches proto if (key in obj) and (obj[key] matches proto[key]) for every key in proto.
If obj and proto are sequences, obj matches proto if they are of the same length and (a matches b) for every (a,b) in zip(obj, proto).
Otherwise, obj matches proto if obj == proto.
Precondition: proto must not have any reference cycles
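The recursive rules translate directly into a sketch (lists stand in for generic sequences here):

```python
def matches(obj, proto):
    """True if obj matches the prototype object proto, per the rules above."""
    if isinstance(obj, dict) and isinstance(proto, dict):
        # every key of proto must be present in obj and match recursively
        return all(k in obj and matches(obj[k], v) for k, v in proto.items())
    if isinstance(obj, list) and isinstance(proto, list):
        return (len(obj) == len(proto)
                and all(matches(a, b) for a, b in zip(obj, proto)))
    return obj == proto
```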
- spack.ci_optimization.subkeys(obj, proto)[source]
Returns the test mapping “obj” after factoring out the items it has in common with the prototype mapping “proto”.
Consider a recursive merge operation, merge(a, b) on mappings a and b, that returns a mapping, m, whose keys are the union of the keys of a and b, and for every such key, “k”, its corresponding value is:
merge(a[key], b[key]) if a[key] and b[key] are mappings, or
b[key] if (key in b) and not matches(a[key], b[key]), or
a[key] otherwise
If obj and proto are mappings, the returned object is the smallest object, “a”, such that merge(a, proto) matches obj.
Otherwise, obj is returned.
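A simplified sketch of the factoring, under the assumption that deep equality is a sufficient match criterion (the real code uses the full matches() semantics):

```python
def subkeys(obj, proto):
    """Drop from obj every item already implied by the prototype proto."""
    if not (isinstance(obj, dict) and isinstance(proto, dict)):
        return obj
    result = {}
    for key, value in obj.items():
        if key in proto:
            if value == proto[key]:
                continue  # merging proto back restores this item
            if isinstance(value, dict) and isinstance(proto[key], dict):
                sub = subkeys(value, proto[key])
                if sub:
                    result[key] = sub
                continue
        result[key] = value
    return result
```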
- spack.ci_optimization.try_optimization_pass(name, yaml, optimization_pass, *args, **kwargs)[source]
Try applying an optimization pass and return information about the result
“name” is a string describing the nature of the pass. If it is a non-empty string, summary statistics are also printed to stdout.
“yaml” is the object to apply the pass to.
“optimization_pass” is the function implementing the pass to be applied.
“args” and “kwargs” are the additional arguments to pass to the optimization pass. The pass is applied as
>>> (new_yaml, *other_results) = optimization_pass(yaml, *args, **kwargs)
The pass’s results are greedily rejected if it does not modify the original yaml document, or if it produces a yaml document that serializes to a larger string.
Returns (new_yaml, yaml, applied, other_results) if applied, or (yaml, new_yaml, applied, other_results) otherwise.
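The accept/reject logic can be sketched as follows; the size comparison via JSON serialization is an assumption (the real code serializes to yaml), and the summary printing tied to a non-empty name is omitted:

```python
import json

def try_optimization_pass(name, yaml, optimization_pass, *args, **kwargs):
    """Apply a pass; greedily reject results that do not shrink the document."""
    new_yaml, *other_results = optimization_pass(yaml, *args, **kwargs)
    applied = (new_yaml != yaml
               and len(json.dumps(new_yaml)) < len(json.dumps(yaml)))
    if applied:
        return (new_yaml, yaml, True, other_results)
    return (yaml, new_yaml, False, other_results)
```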
spack.compiler module
- class spack.compiler.Compiler(cspec, operating_system, target, paths, modules=None, alias=None, environment=None, extra_rpaths=None, enable_implicit_rpaths=None, **kwargs)[source]
Bases:
object
This class encapsulates a Spack “compiler”, which includes C, C++, and Fortran compilers. Subclasses should implement support for specific compilers, their possible names, arguments, and how to identify the particular type of compiler.
- property c11_flag
- property c99_flag
- property cc_pic_flag
Returns the flag used by the C compiler to produce Position Independent Code (PIC).
- property cc_rpath_arg
- property cxx11_flag
- property cxx14_flag
- property cxx17_flag
- property cxx98_flag
- property cxx_pic_flag
Returns the flag used by the C++ compiler to produce Position Independent Code (PIC).
- property cxx_rpath_arg
- property debug_flags
- classmethod default_version(cc)[source]
Override just this to override all compiler version functions.
- property disable_new_dtags
- property enable_new_dtags
- classmethod extract_version_from_output(output)[source]
Extracts the version from compiler’s output.
- property f77_pic_flag
Returns the flag used by the F77 compiler to produce Position Independent Code (PIC).
- property f77_rpath_arg
- property fc_pic_flag
Returns the flag used by the FC compiler to produce Position Independent Code (PIC).
- property fc_rpath_arg
- get_real_version()[source]
Query the compiler for its version.
This is the “real” compiler version, regardless of what is in the compilers.yaml file, which the user can change to name their compiler.
Use the runtime environment of the compiler (modules and environment modifications) to enable the compiler to run properly on any platform.
- ignore_version_errors: Sequence[int] = ()
Return values to ignore when invoking the compiler to get its version
- property linker_arg
Flag that needs to be used to pass an argument to the linker.
- property openmp_flag
- property opt_flags
- property prefix
Query the compiler for its install prefix. This is the install path as reported by the compiler. Note that paths for cc, cxx, etc. are not enough to find the install prefix of the compiler, since they can be symlinks, wrappers, or filenames instead of absolute paths.
- property real_version
The compiler version reported by the executable, used for API determinations (e.g. C++11 flag checks).
- property required_libs
For executables created with this compiler, the compiler libraries that would be generally required to run it.
- setup_custom_environment(pkg, env)[source]
Set any environment variables necessary to use the compiler.
- suffixes = ['-.*']
- property verbose_flag
This property should be overridden in the compiler subclass if a verbose flag is available.
If it is not overridden, it is assumed to not be supported.
- verify_executables()[source]
Raise an error if any of the compiler executables is not valid.
This method confirms that for all of the compilers (cc, cxx, f77, fc) that have paths, those paths exist and are executable by the current user. Raises a CompilerAccessError if any of the non-null paths for the compiler are not accessible.
- property version
- version_argument = '-dumpversion'
Compiler argument that produces version information
- version_regex = '(.*)'
Regex used to extract version from compiler’s output
spack.concretize module
Functions here are used to take abstract specs and make them concrete. For example, if a spec asks for a version between 1.8 and 1.9, these functions might take the most recent 1.9 version of the package available. Or, if the user didn’t specify a compiler for a spec, then this will assign a compiler to the spec based on defaults or user preferences.
- TODO: make this customizable and allow users to configure
concretization policies.
- class spack.concretize.Concretizer(abstract_spec=None)[source]
Bases:
object
You can subclass this class to override some of the default concretization strategies, or you can override all of them.
- adjust_target(spec)[source]
Adjusts the target microarchitecture if the compiler is too old to support the default one.
- Parameters:
spec – spec to be concretized
- Returns:
True if spec was modified, False otherwise
- check_for_compiler_existence = None
Controls whether we check that compiler versions actually exist during concretization. Used for testing and for mirror creation
- choose_virtual_or_external(spec)[source]
Given a list of candidate virtual and external packages, try to find one that is most ABI compatible.
- concretize_architecture(spec)[source]
If the spec is empty provide the defaults of the platform. If the architecture is not a string type, then check if either the platform, target or operating system are concretized. If any of the fields are changed then return True. If everything is concretized (i.e the architecture attribute is a namedtuple of classes) then return False. If the target is a string type, then convert the string into a concretized architecture. If it has no architecture and the root of the DAG has an architecture, then use the root otherwise use the defaults on the platform.
- concretize_compiler(spec)[source]
If the spec already has a compiler, we’re done. If not, then take the compiler used for the nearest ancestor with a compiler spec and use that. If the ancestor’s compiler is not concrete, then use the preferred compiler as specified in spackconfig.
Intuition: Use the spackconfig default if no package that depends on this one has a strict compiler requirement. Otherwise, try to build with the compiler that will be used by libraries that link to this one, to maximize compatibility.
- concretize_compiler_flags(spec)[source]
The compiler flags are updated to match those of the spec whose compiler is used, defaulting to no compiler flags in the spec. Default specs set at the compiler level will still be added later.
- concretize_variants(spec)[source]
If the spec already has variants filled in, return. Otherwise, add the user preferences from packages.yaml or the default variants from the package specification.
- concretize_version(spec)[source]
If the spec is already concrete, return. Otherwise take the preferred version from spackconfig, and default to the package’s version if there are no available versions.
- TODO: In many cases we probably want to look for installed
versions of each package and use an installed version if we can link to it. The policy implemented here will tend to rebuild a lot of stuff because it will prefer a compiler in the spec to any compiler that already-installed things were built with. There is likely some better policy that finds some middle ground between these two extremes.
- exception spack.concretize.InsufficientArchitectureInfoError(spec, archs)[source]
Bases:
SpackError
Raised when details on architecture cannot be collected from the system
- exception spack.concretize.NoBuildError(spec)[source]
Bases:
SpecError
Raised when a package is configured with the buildable option False, but no satisfactory external versions can be found
- exception spack.concretize.NoCompilersForArchError(arch, available_os_targets)[source]
Bases:
SpackError
- exception spack.concretize.NoValidVersionError(spec)[source]
Bases:
SpackError
Raised when there is no way to have a concrete version for a particular spec.
- exception spack.concretize.UnavailableCompilerVersionError(compiler_spec, arch=None)[source]
Bases:
SpackError
Raised when there is no available compiler that satisfies a compiler spec.
- spack.concretize.concretize_specs_together(*abstract_specs, **kwargs)[source]
Given a number of specs as input, tries to concretize them together.
spack.config module
This module implements Spack’s configuration file handling.
This implements Spack’s configuration system, which handles merging multiple scopes with different levels of precedence. See the documentation on Configuration Scopes for details on how Spack’s configuration system behaves. The scopes are:
default
system
site
user
And corresponding per-platform scopes. Important functions in this module are:
get_config reads in YAML data for a particular scope and returns it. Callers can then modify the data and write it back with update_config.
When read in, Spack validates configurations with jsonschemas. The schemas are in submodules of spack.schema.
- exception spack.config.ConfigError(message, long_message=None)[source]
Bases:
SpackError
Superclass for all Spack config related errors.
- exception spack.config.ConfigFileError(message, long_message=None)[source]
Bases:
ConfigError
Issue reading or accessing a configuration file.
- exception spack.config.ConfigFormatError(validation_error, data, filename=None, line=None)[source]
Bases:
ConfigError
Raised when a configuration format does not match its schema.
- class spack.config.ConfigScope(name, path)[source]
Bases:
object
This class represents a configuration scope.
A scope is one directory containing named configuration files. Each file is a config “section” (e.g., mirrors, compilers, etc).
- property is_platform_dependent
- exception spack.config.ConfigSectionError(message, long_message=None)[source]
Bases:
ConfigError
Error for referring to a bad config section name in a configuration.
- class spack.config.Configuration(*scopes: ConfigScope)[source]
Bases:
object
A full Spack configuration, from a hierarchy of config files.
This class makes it easy to add a new scope on top of an existing one.
- clear_caches()[source]
Clears the caches for configuration files. This will cause files to be re-read upon the next request.
- property file_scopes: List[ConfigScope]
List of writable scopes with an associated file.
- get(path, default=None, scope=None)[source]
Get a config section or a single value from one.
Accepts a path syntax that allows us to grab nested config map entries. Getting the ‘config’ section would look like:
spack.config.get('config')
and the dirty section in the config scope would be:
spack.config.get('config:dirty')
We use : as the separator, like YAML objects.
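The colon-separated path syntax can be pictured with a small standalone sketch; get_by_path below is a hypothetical helper written for illustration, not Spack’s implementation:

```python
# Hypothetical helper illustrating ':'-separated lookup over nested
# YAML-style data, mirroring paths like 'config:dirty'.
def get_by_path(data, path, default=None):
    node = data
    for key in path.split(":"):
        if not isinstance(node, dict) or key not in node:
            return default
        node = node[key]
    return node

config_data = {"config": {"dirty": False, "build_jobs": 2}}
assert get_by_path(config_data, "config:dirty") is False
assert get_by_path(config_data, "config:missing", default=4) == 4
```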
- get_config(section, scope=None)[source]
Get configuration settings for a section.
If scope is None or not provided, return the merged contents of all of Spack’s configuration scopes. If scope is provided, return only the configuration as specified in that scope.
This strips off the top-level name from the YAML section. That is, for a YAML config file that looks like this:
config:
  install_tree:
    root: $spack/opt/spack
  build_stage:
  - $tmpdir/$user/spack-stage
get_config('config') will return:
{
    'install_tree': {
        'root': '$spack/opt/spack',
    },
    'build_stage': ['$tmpdir/$user/spack-stage']
}
- get_config_filename(scope, section) str [source]
For some scope and section, get the name of the configuration file.
- highest_precedence_non_platform_scope() ConfigScope [source]
Non-internal non-platform scope with highest precedence
Platform-specific scopes are of the form scope/platform
- highest_precedence_scope() ConfigScope [source]
Non-internal scope with highest precedence.
- matching_scopes(reg_expr) List[ConfigScope] [source]
List of all scopes whose names match the provided regular expression.
For example, matching_scopes(r’^command’) will return all scopes whose names begin with command.
- pop_scope() ConfigScope [source]
Remove the highest precedence scope and return it.
- push_scope(scope: ConfigScope)[source]
Add a higher precedence scope to the Configuration.
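The push/pop scope mechanics can be sketched with a deliberately simplified class (TinyConfiguration is illustrative only, assuming that later-pushed scopes take precedence, as described for Configuration):

```python
# Simplified sketch of a scope stack: lowest precedence first,
# push_scope adds a higher-precedence scope on top.
class TinyConfiguration:
    def __init__(self):
        self.scopes = []

    def push_scope(self, name, data):
        self.scopes.append((name, data))

    def pop_scope(self):
        # Remove and return the highest-precedence scope.
        return self.scopes.pop()

    def get(self, key, default=None):
        # The highest-precedence scope that defines the key wins.
        for _, data in reversed(self.scopes):
            if key in data:
                return data[key]
        return default

cfg = TinyConfiguration()
cfg.push_scope("defaults", {"build_jobs": 2, "checksum": True})
cfg.push_scope("user", {"build_jobs": 8})
assert cfg.get("build_jobs") == 8   # user scope overrides defaults
cfg.pop_scope()
assert cfg.get("build_jobs") == 2   # back to defaults
```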
- remove_scope(scope_name: str) ConfigScope | None [source]
Remove scope by name; has no effect when scope_name does not exist.
- scopes: Dict[str, ConfigScope]
- set(path, value, scope=None)[source]
Convenience function for setting single values in config files.
Accepts the path syntax described in get().
- update_config(section: str, update_data: Dict, scope: str | None = None, force: bool = False)[source]
Update the configuration file for a particular scope.
Overwrites contents of a section in a scope with update_data, then writes out the config file.
update_data should have the top-level section name stripped off (it will be re-added). Data itself can be a list, dict, or any other yaml-ish structure.
Configuration scopes that are still written in an old schema format will fail to update unless force is True.
- class spack.config.ImmutableConfigScope(name, path)[source]
Bases:
ConfigScope
A configuration scope that cannot be written to.
This is used for ConfigScopes passed on the command line.
- class spack.config.InternalConfigScope(name, data=None)[source]
Bases:
ConfigScope
An internal configuration scope that is not persisted to a file.
This is for spack internal use so that command-line options and config file settings are accessed the same way, and Spack can easily override settings from files.
- class spack.config.SingleFileScope(name, path, schema, yaml_path=None)[source]
Bases:
ConfigScope
This class represents a configuration scope in a single YAML file.
- property is_platform_dependent
- spack.config.add(fullpath, scope=None)[source]
Add the given configuration to the specified config scope. Add accepts a path. If you want to add from a filename, use add_from_file
- spack.config.collect_urls(base_url: str) list [source]
Return a list of configuration URLs.
- Parameters:
base_url – URL for a configuration (yaml) file or a directory containing yaml file(s)
- Returns:
List of configuration file(s) or empty list if none
- spack.config.command_line_scopes: List[str] = []
configuration scopes added on the command line, set by spack.main.main().
- spack.config.config: Configuration | Singleton = <spack.config.Configuration object>
This is the singleton configuration instance for Spack.
- spack.config.config_defaults = {'config': {'build_jobs': 2, 'build_stage': '$tempdir/spack-stage', 'checksum': True, 'concretizer': 'clingo', 'connect_timeout': 10, 'debug': False, 'dirty': False, 'license_dir': '/home/docs/checkouts/readthedocs.org/user_builds/spack/checkouts/v0.20.3/lib/spack/docs/_spack_root/etc/spack/licenses', 'verify_ssl': True}}
Hard-coded default values for some key configuration options. This ensures that Spack will still work even if config.yaml in the defaults scope is removed.
- spack.config.configuration_defaults_path = ('defaults', '/home/docs/checkouts/readthedocs.org/user_builds/spack/checkouts/v0.20.3/lib/spack/docs/_spack_root/etc/spack/defaults')
Path to the default configuration
- spack.config.default_list_scope()[source]
Return the config scope that is listed by default.
Commands that list configuration list all scopes (merged) by default.
- spack.config.default_modify_scope(section='config')[source]
Return the config scope that commands should modify by default.
Commands that modify configuration by default modify the highest priority scope.
- Parameters:
section (str) – Section for which to get the default scope. If this is not ‘compilers’, a general (non-platform) scope is used.
- spack.config.ensure_latest_format_fn(section)[source]
Return a function that takes as input a dictionary read from a configuration file and updates it to the latest format.
The function returns True if there was any update, False otherwise.
- Parameters:
section (str) – section of the configuration e.g. “packages”, “config”, etc.
- spack.config.fetch_remote_configs(url: str, dest_dir: str, skip_existing: bool = True) str [source]
Retrieve configuration file(s) at the specified URL.
- Parameters:
url – URL for a configuration (yaml) file or a directory containing yaml file(s)
dest_dir – destination directory
skip_existing – Skip files that already exist in dest_dir if True; otherwise, replace those files
- Returns:
Path to the corresponding file if URL is or contains a single file and it is the only file in the destination directory or the root (dest_dir) directory if multiple configuration files exist or are retrieved.
- spack.config.first_existing(dictionary, keys)[source]
Get the value of the first key in keys that is in the dictionary.
- spack.config.get(path, default=None, scope=None)[source]
Module-level wrapper for Configuration.get().
- spack.config.get_valid_type(path)[source]
Returns an instance of a type that will pass validation for path.
The instance is created by calling the constructor with no arguments. If multiple types will satisfy validation for data at the configuration path given, the priority order is list, dict, str, bool, int, float.
- spack.config.merge_yaml(dest, source, prepend=False, append=False)[source]
Merges source into dest; entries in source take precedence over dest.
This routine may modify dest and should be assigned to dest, in case dest was None to begin with, e.g.:
dest = merge_yaml(dest, source)
In the result, elements from lists from source will appear before elements of lists from dest. Likewise, when iterating over keys or items in merged OrderedDict objects, keys from source will appear before keys from dest.
Config file authors can optionally end any attribute in a dict with :: instead of :, and the key will override that of the parent instead of merging.
+: will extend the default prepend merge strategy to include string concatenation. -: will change the merge strategy to append; it also includes string concatenation.
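These merge semantics can be sketched with a simplified stand-in (tiny_merge is illustrative, not Spack’s actual merge_yaml, and only models source precedence, list prepending, and the :: override):

```python
# Simplified sketch of merge_yaml-style semantics: dicts merge
# recursively with source winning, source lists are prepended, and a
# key ending in '::' replaces the destination value outright.
def tiny_merge(dest, source):
    if dest is None:
        return source
    if isinstance(dest, list) and isinstance(source, list):
        return source + dest  # source elements come first
    if isinstance(dest, dict) and isinstance(source, dict):
        result = dict(dest)
        for key, value in source.items():
            if key.endswith("::"):
                # '::' means override: drop dest's value, don't merge
                result.pop(key.rstrip(":"), None)
                result[key.rstrip(":")] = value
            else:
                result[key] = tiny_merge(dest.get(key), value)
        return result
    return source  # scalars: source takes precedence

dest = {"mirrors": ["a"], "config": {"build_jobs": 2, "debug": False}}
src = {"mirrors": ["b"], "config::": {"build_jobs": 8}}
merged = tiny_merge(dest, src)
assert merged["mirrors"] == ["b", "a"]        # source list prepended
assert merged["config"] == {"build_jobs": 8}  # '::' replaced the section
```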
- spack.config.override(path_or_scope, value=None)[source]
Simple way to override config settings within a context.
- Parameters:
path_or_scope (ConfigScope or str) – scope or single option to override
value (object or None) – value for the single option
Temporarily push a scope on the current configuration, then remove it after the context completes. If a single option is provided, create an internal config scope for it and push/pop that scope.
- spack.config.overrides_base_name = 'overrides-'
Base name for the (internal) overrides scope.
- spack.config.raw_github_gitlab_url(url)[source]
Transform a GitHub URL to the raw form to avoid undesirable HTML.
- Parameters:
url – url to be converted to raw form
Returns: (str) raw github/gitlab url or the original url
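A minimal sketch of the GitHub blob-to-raw transformation (assumed behavior for the common case; the real raw_github_gitlab_url also handles GitLab URLs and other forms):

```python
# Illustrative transform: github.com/<org>/<repo>/blob/<ref>/<path>
# becomes raw.githubusercontent.com/<org>/<repo>/<ref>/<path>.
def to_raw_github(url):
    if "github.com" in url and "/blob/" in url:
        return (url.replace("github.com", "raw.githubusercontent.com")
                   .replace("/blob/", "/", 1))
    return url  # unchanged if not a recognized GitHub blob URL

assert to_raw_github(
    "https://github.com/spack/spack/blob/develop/README.md"
) == "https://raw.githubusercontent.com/spack/spack/develop/README.md"
```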
- spack.config.read_config_file(filename, schema=None)[source]
Read a YAML configuration file.
User can provide a schema for validation. If no schema is provided, we will infer the schema from the top-level key.
- spack.config.remove_yaml(dest, source)[source]
UnMerges source from dest; entries in source take precedence over dest.
This routine may modify dest and should be assigned to dest, in case dest was None to begin with, e.g.:
dest = remove_yaml(dest, source)
In the result, elements from lists from source will not appear as elements of lists from dest. Likewise, when iterating over keys or items in merged OrderedDict objects, keys from source will not appear as keys in dest.
Config file authors can optionally end any attribute in a dict with :: instead of :, and the key will remove the entire section from dest.
- spack.config.scopes_metavar = '{defaults,system,site,user}[/PLATFORM] or env:ENVIRONMENT'
metavar to use for commands that accept scopes; this is shorter and more readable than listing all choices
- spack.config.section_schemas = {'bootstrap': {'$schema': 'http://json-schema.org/draft-07/schema#', 'additionalProperties': False, 'properties': {'bootstrap': {'properties': {'enable': {'type': 'boolean'}, 'root': {'type': 'string'}, 'sources': {'items': {'additionalProperties': False, 'properties': {'metadata': {'type': 'string'}, 'name': {'type': 'string'}}, 'required': ['name', 'metadata'], 'type': 'object'}, 'type': 'array'}, 'trusted': {'patternProperties': {'\\w[\\w-]*': {'type': 'boolean'}}, 'type': 'object'}}, 'type': 'object'}}, 'title': 'Spack bootstrap configuration file schema', 'type': 'object'}, 'cdash': {'$schema': 'http://json-schema.org/draft-07/schema#', 'additionalProperties': False, 'properties': {'cdash': {'additionalProperties': False, 'patternProperties': {'build-group': {'type': 'string'}, 'project': {'type': 'string'}, 'site': {'type': 'string'}, 'url': {'type': 'string'}}, 'required': ['build-group'], 'type': 'object'}}, 'title': 'Spack cdash configuration file schema', 'type': 'object'}, 'ci': {'$schema': 'http://json-schema.org/draft-07/schema#', 'additionalProperties': False, 'properties': {'ci': {'oneOf': [{'anyOf': [{'type': 'object', 'additionalProperties': False, 'properties': {'pipeline-gen': {'type': 'array', 'items': {'oneOf': [{'type': 'object', 'additionalProperties': False, 'required': ['submapping'], 'properties': {'match_behavior': {'type': 'string', 'enum': ['first', 'merge'], 'default': 'first'}, 'submapping': {'type': 'array', 'items': {'type': 'object', 'additionalProperties': False, 'required': ['match'], 'properties': {'match': {'type': 'array', 'items': {'type': 'string'}}, 'build-job': {'type': 'object', 'additionalProperties': True, 'properties': {'image': {'oneOf': [{'type': 'string'}, {'type': 'object', 'properties': {'name': {'type': 'string'}, 'entrypoint': {'type': 'array', 'items': {'type': 'string'}}}}]}, 'tags': {'type': 'array', 'items': {'type': 'string'}}, 'variables': {'type': 'object', 
'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}}, 'before_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'after_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}}}, 'build-job-remove': {'type': 'object', 'additionalProperties': True, 'properties': {'image': {'oneOf': [{'type': 'string'}, {'type': 'object', 'properties': {'name': {'type': 'string'}, 'entrypoint': {'type': 'array', 'items': {'type': 'string'}}}}]}, 'tags': {'type': 'array', 'items': {'type': 'string'}}, 'variables': {'type': 'object', 'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}}, 'before_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'after_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}}}}}}}}, {'oneOf': [{'type': 'object', 'additionalProperties': False, 'properties': {'noop-job': {'type': 'object', 'additionalProperties': True, 'properties': {'image': {'oneOf': [{'type': 'string'}, {'type': 'object', 'properties': {'name': {'type': 'string'}, 'entrypoint': {'type': 'array', 'items': {'type': 'string'}}}}]}, 'tags': {'type': 'array', 'items': {'type': 'string'}}, 'variables': {'type': 'object', 'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}}, 'before_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'after_script': {'type': 'array', 'items': {'anyOf': [{'type': 
'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}}}, 'noop-job-remove': {'type': 'object', 'additionalProperties': True, 'properties': {'image': {'oneOf': [{'type': 'string'}, {'type': 'object', 'properties': {'name': {'type': 'string'}, 'entrypoint': {'type': 'array', 'items': {'type': 'string'}}}}]}, 'tags': {'type': 'array', 'items': {'type': 'string'}}, 'variables': {'type': 'object', 'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}}, 'before_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'after_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}}}}}, {'type': 'object', 'additionalProperties': False, 'properties': {'build-job': {'type': 'object', 'additionalProperties': True, 'properties': {'image': {'oneOf': [{'type': 'string'}, {'type': 'object', 'properties': {'name': {'type': 'string'}, 'entrypoint': {'type': 'array', 'items': {'type': 'string'}}}}]}, 'tags': {'type': 'array', 'items': {'type': 'string'}}, 'variables': {'type': 'object', 'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}}, 'before_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'after_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}}}, 'build-job-remove': {'type': 'object', 'additionalProperties': True, 'properties': {'image': {'oneOf': [{'type': 'string'}, {'type': 'object', 'properties': {'name': {'type': 'string'}, 'entrypoint': {'type': 'array', 'items': {'type': 'string'}}}}]}, 'tags': {'type': 'array', 'items': {'type': 'string'}}, 'variables': 
{'type': 'object', 'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}}, 'before_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'after_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}}}}}, {'type': 'object', 'additionalProperties': False, 'properties': {'copy-job': {'type': 'object', 'additionalProperties': True, 'properties': {'image': {'oneOf': [{'type': 'string'}, {'type': 'object', 'properties': {'name': {'type': 'string'}, 'entrypoint': {'type': 'array', 'items': {'type': 'string'}}}}]}, 'tags': {'type': 'array', 'items': {'type': 'string'}}, 'variables': {'type': 'object', 'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}}, 'before_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'after_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}}}, 'copy-job-remove': {'type': 'object', 'additionalProperties': True, 'properties': {'image': {'oneOf': [{'type': 'string'}, {'type': 'object', 'properties': {'name': {'type': 'string'}, 'entrypoint': {'type': 'array', 'items': {'type': 'string'}}}}]}, 'tags': {'type': 'array', 'items': {'type': 'string'}}, 'variables': {'type': 'object', 'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}}, 'before_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'after_script': {'type': 'array', 'items': {'anyOf': 
[{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}}}}}, {'type': 'object', 'additionalProperties': False, 'properties': {'reindex-job': {'type': 'object', 'additionalProperties': True, 'properties': {'image': {'oneOf': [{'type': 'string'}, {'type': 'object', 'properties': {'name': {'type': 'string'}, 'entrypoint': {'type': 'array', 'items': {'type': 'string'}}}}]}, 'tags': {'type': 'array', 'items': {'type': 'string'}}, 'variables': {'type': 'object', 'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}}, 'before_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'after_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}}}, 'reindex-job-remove': {'type': 'object', 'additionalProperties': True, 'properties': {'image': {'oneOf': [{'type': 'string'}, {'type': 'object', 'properties': {'name': {'type': 'string'}, 'entrypoint': {'type': 'array', 'items': {'type': 'string'}}}}]}, 'tags': {'type': 'array', 'items': {'type': 'string'}}, 'variables': {'type': 'object', 'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}}, 'before_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'after_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}}}}}, {'type': 'object', 'additionalProperties': False, 'properties': {'signing-job': {'type': 'object', 'additionalProperties': True, 'properties': {'image': {'oneOf': [{'type': 'string'}, {'type': 'object', 'properties': {'name': {'type': 'string'}, 'entrypoint': {'type': 'array', 'items': {'type': 
'string'}}}}]}, 'tags': {'type': 'array', 'items': {'type': 'string'}}, 'variables': {'type': 'object', 'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}}, 'before_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'after_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}}}, 'signing-job-remove': {'type': 'object', 'additionalProperties': True, 'properties': {'image': {'oneOf': [{'type': 'string'}, {'type': 'object', 'properties': {'name': {'type': 'string'}, 'entrypoint': {'type': 'array', 'items': {'type': 'string'}}}}]}, 'tags': {'type': 'array', 'items': {'type': 'string'}}, 'variables': {'type': 'object', 'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}}, 'before_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'after_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}}}}}, {'type': 'object', 'additionalProperties': False, 'properties': {'cleanup-job': {'type': 'object', 'additionalProperties': True, 'properties': {'image': {'oneOf': [{'type': 'string'}, {'type': 'object', 'properties': {'name': {'type': 'string'}, 'entrypoint': {'type': 'array', 'items': {'type': 'string'}}}}]}, 'tags': {'type': 'array', 'items': {'type': 'string'}}, 'variables': {'type': 'object', 'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}}, 'before_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 
'items': {'type': 'string'}}]}}, 'after_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}}}, 'cleanup-job-remove': {'type': 'object', 'additionalProperties': True, 'properties': {'image': {'oneOf': [{'type': 'string'}, {'type': 'object', 'properties': {'name': {'type': 'string'}, 'entrypoint': {'type': 'array', 'items': {'type': 'string'}}}}]}, 'tags': {'type': 'array', 'items': {'type': 'string'}}, 'variables': {'type': 'object', 'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}}, 'before_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'after_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}}}}}, {'type': 'object', 'additionalProperties': False, 'properties': {'any-job': {'type': 'object', 'additionalProperties': True, 'properties': {'image': {'oneOf': [{'type': 'string'}, {'type': 'object', 'properties': {'name': {'type': 'string'}, 'entrypoint': {'type': 'array', 'items': {'type': 'string'}}}}]}, 'tags': {'type': 'array', 'items': {'type': 'string'}}, 'variables': {'type': 'object', 'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}}, 'before_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'after_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}}}, 'any-job-remove': {'type': 'object', 'additionalProperties': True, 'properties': {'image': {'oneOf': [{'type': 'string'}, {'type': 'object', 'properties': {'name': {'type': 'string'}, 'entrypoint': {'type': 'array', 'items': 
{'type': 'string'}}}}]}, 'tags': {'type': 'array', 'items': {'type': 'string'}}, 'variables': {'type': 'object', 'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}}, 'before_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'after_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}}}}}]}]}}, 'bootstrap': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'object', 'additionalProperties': False, 'required': ['name'], 'properties': {'name': {'type': 'string'}, 'compiler-agnostic': {'type': 'boolean', 'default': False}}}]}}, 'rebuild-index': {'type': 'boolean'}, 'broken-specs-url': {'type': 'string'}, 'broken-tests-packages': {'type': 'array', 'items': {'type': 'string'}}, 'target': {'type': 'string', 'enum': ['gitlab'], 'default': 'gitlab'}, 'enable-artifacts-buildcache': {'type': 'boolean'}}}, {'type': 'object', 'additionalProperties': False, 'properties': {'pipeline-gen': {'type': 'array', 'items': {'oneOf': [{'type': 'object', 'additionalProperties': False, 'required': ['submapping'], 'properties': {'match_behavior': {'type': 'string', 'enum': ['first', 'merge'], 'default': 'first'}, 'submapping': {'type': 'array', 'items': {'type': 'object', 'additionalProperties': False, 'required': ['match'], 'properties': {'match': {'type': 'array', 'items': {'type': 'string'}}, 'build-job': {'type': 'object', 'additionalProperties': True, 'properties': {'image': {'oneOf': [{'type': 'string'}, {'type': 'object', 'properties': {'name': {'type': 'string'}, 'entrypoint': {'type': 'array', 'items': {'type': 'string'}}}}]}, 'tags': {'type': 'array', 'items': {'type': 'string'}}, 'variables': {'type': 'object', 'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}}, 'before_script': {'type': 'array', 
'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'after_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}}}, 'build-job-remove': {'type': 'object', 'additionalProperties': True, 'properties': {'image': {'oneOf': [{'type': 'string'}, {'type': 'object', 'properties': {'name': {'type': 'string'}, 'entrypoint': {'type': 'array', 'items': {'type': 'string'}}}}]}, 'tags': {'type': 'array', 'items': {'type': 'string'}}, 'variables': {'type': 'object', 'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}}, 'before_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'after_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}}}}}}}}, {'oneOf': [{'type': 'object', 'additionalProperties': False, 'properties': {'noop-job': {'type': 'object', 'additionalProperties': True, 'properties': {'image': {'oneOf': [{'type': 'string'}, {'type': 'object', 'properties': {'name': {'type': 'string'}, 'entrypoint': {'type': 'array', 'items': {'type': 'string'}}}}]}, 'tags': {'type': 'array', 'items': {'type': 'string'}}, 'variables': {'type': 'object', 'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}}, 'before_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'after_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}}}, 'noop-job-remove': {'type': 'object', 
'additionalProperties': True, 'properties': {'image': {'oneOf': [{'type': 'string'}, {'type': 'object', 'properties': {'name': {'type': 'string'}, 'entrypoint': {'type': 'array', 'items': {'type': 'string'}}}}]}, 'tags': {'type': 'array', 'items': {'type': 'string'}}, 'variables': {'type': 'object', 'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}}, 'before_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'after_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}}}}}, {'type': 'object', 'additionalProperties': False, 'properties': {'build-job': {'type': 'object', 'additionalProperties': True, 'properties': {'image': {'oneOf': [{'type': 'string'}, {'type': 'object', 'properties': {'name': {'type': 'string'}, 'entrypoint': {'type': 'array', 'items': {'type': 'string'}}}}]}, 'tags': {'type': 'array', 'items': {'type': 'string'}}, 'variables': {'type': 'object', 'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}}, 'before_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'after_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}}}, 'build-job-remove': {'type': 'object', 'additionalProperties': True, 'properties': {'image': {'oneOf': [{'type': 'string'}, {'type': 'object', 'properties': {'name': {'type': 'string'}, 'entrypoint': {'type': 'array', 'items': {'type': 'string'}}}}]}, 'tags': {'type': 'array', 'items': {'type': 'string'}}, 'variables': {'type': 'object', 'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}}, 'before_script': 
{'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'after_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}}}}}, {'type': 'object', 'additionalProperties': False, 'properties': {'copy-job': {'type': 'object', 'additionalProperties': True, 'properties': {'image': {'oneOf': [{'type': 'string'}, {'type': 'object', 'properties': {'name': {'type': 'string'}, 'entrypoint': {'type': 'array', 'items': {'type': 'string'}}}}]}, 'tags': {'type': 'array', 'items': {'type': 'string'}}, 'variables': {'type': 'object', 'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}}, 'before_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'after_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}}}, 'copy-job-remove': {'type': 'object', 'additionalProperties': True, 'properties': {'image': {'oneOf': [{'type': 'string'}, {'type': 'object', 'properties': {'name': {'type': 'string'}, 'entrypoint': {'type': 'array', 'items': {'type': 'string'}}}}]}, 'tags': {'type': 'array', 'items': {'type': 'string'}}, 'variables': {'type': 'object', 'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}}, 'before_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'after_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}}}}}, {'type': 'object', 
'additionalProperties': False, 'properties': {'reindex-job': {'type': 'object', 'additionalProperties': True, 'properties': {'image': {'oneOf': [{'type': 'string'}, {'type': 'object', 'properties': {'name': {'type': 'string'}, 'entrypoint': {'type': 'array', 'items': {'type': 'string'}}}}]}, 'tags': {'type': 'array', 'items': {'type': 'string'}}, 'variables': {'type': 'object', 'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}}, 'before_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'after_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}}}, 'reindex-job-remove': {'type': 'object', 'additionalProperties': True, 'properties': {'image': {'oneOf': [{'type': 'string'}, {'type': 'object', 'properties': {'name': {'type': 'string'}, 'entrypoint': {'type': 'array', 'items': {'type': 'string'}}}}]}, 'tags': {'type': 'array', 'items': {'type': 'string'}}, 'variables': {'type': 'object', 'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}}, 'before_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'after_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}}}}}, {'type': 'object', 'additionalProperties': False, 'properties': {'signing-job': {'type': 'object', 'additionalProperties': True, 'properties': {'image': {'oneOf': [{'type': 'string'}, {'type': 'object', 'properties': {'name': {'type': 'string'}, 'entrypoint': {'type': 'array', 'items': {'type': 'string'}}}}]}, 'tags': {'type': 'array', 'items': {'type': 'string'}}, 'variables': {'type': 'object', 
'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}}, 'before_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'after_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}}}, 'signing-job-remove': {'type': 'object', 'additionalProperties': True, 'properties': {'image': {'oneOf': [{'type': 'string'}, {'type': 'object', 'properties': {'name': {'type': 'string'}, 'entrypoint': {'type': 'array', 'items': {'type': 'string'}}}}]}, 'tags': {'type': 'array', 'items': {'type': 'string'}}, 'variables': {'type': 'object', 'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}}, 'before_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'after_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}}}}}, {'type': 'object', 'additionalProperties': False, 'properties': {'cleanup-job': {'type': 'object', 'additionalProperties': True, 'properties': {'image': {'oneOf': [{'type': 'string'}, {'type': 'object', 'properties': {'name': {'type': 'string'}, 'entrypoint': {'type': 'array', 'items': {'type': 'string'}}}}]}, 'tags': {'type': 'array', 'items': {'type': 'string'}}, 'variables': {'type': 'object', 'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}}, 'before_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'after_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, 
{'type': 'array', 'items': {'type': 'string'}}]}}}}, 'cleanup-job-remove': {'type': 'object', 'additionalProperties': True, 'properties': {'image': {'oneOf': [{'type': 'string'}, {'type': 'object', 'properties': {'name': {'type': 'string'}, 'entrypoint': {'type': 'array', 'items': {'type': 'string'}}}}]}, 'tags': {'type': 'array', 'items': {'type': 'string'}}, 'variables': {'type': 'object', 'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}}, 'before_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'after_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}}}}}, {'type': 'object', 'additionalProperties': False, 'properties': {'any-job': {'type': 'object', 'additionalProperties': True, 'properties': {'image': {'oneOf': [{'type': 'string'}, {'type': 'object', 'properties': {'name': {'type': 'string'}, 'entrypoint': {'type': 'array', 'items': {'type': 'string'}}}}]}, 'tags': {'type': 'array', 'items': {'type': 'string'}}, 'variables': {'type': 'object', 'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}}, 'before_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'after_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}}}, 'any-job-remove': {'type': 'object', 'additionalProperties': True, 'properties': {'image': {'oneOf': [{'type': 'string'}, {'type': 'object', 'properties': {'name': {'type': 'string'}, 'entrypoint': {'type': 'array', 'items': {'type': 'string'}}}}]}, 'tags': {'type': 'array', 'items': {'type': 'string'}}, 'variables': {'type': 
'object', 'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}}, 'before_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}, 'after_script': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}}}}}}]}]}}, 'bootstrap': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'object', 'additionalProperties': False, 'required': ['name'], 'properties': {'name': {'type': 'string'}, 'compiler-agnostic': {'type': 'boolean', 'default': False}}}]}}, 'rebuild-index': {'type': 'boolean'}, 'broken-specs-url': {'type': 'string'}, 'broken-tests-packages': {'type': 'array', 'items': {'type': 'string'}}, 'target': {'type': 'string', 'enum': ['gitlab'], 'default': 'gitlab'}, 'temporary-storage-url-prefix': {'type': 'string'}}}]}, {'anyOf': [{'type': 'object', 'additionalProperties': False, 'required': ['mappings'], 'properties': {'image': {'oneOf': [{'type': 'string'}, {'type': 'object', 'properties': {'name': {'type': 'string'}, 'entrypoint': {'type': 'array', 'items': {'type': 'string'}}}}]}, 'tags': {'type': 'array', 'items': {'type': 'string'}}, 'variables': {'type': 'object', 'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}}, 'before_script': {'type': 'array', 'items': {'type': 'string'}}, 'script': {'type': 'array', 'items': {'type': 'string'}}, 'after_script': {'type': 'array', 'items': {'type': 'string'}}, 'bootstrap': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'object', 'additionalProperties': False, 'required': ['name'], 'properties': {'name': {'type': 'string'}, 'compiler-agnostic': {'type': 'boolean', 'default': False}}}]}}, 'match_behavior': {'type': 'string', 'enum': ['first', 'merge'], 'default': 'first'}, 'mappings': {'type': 'array', 'items': {'type': 'object', 
'additionalProperties': False, 'required': ['match'], 'properties': {'match': {'type': 'array', 'items': {'type': 'string'}}, 'remove-attributes': {'type': 'object', 'additionalProperties': False, 'required': ['tags'], 'properties': {'tags': {'type': 'array', 'items': {'type': 'string'}}}}, 'runner-attributes': {'type': 'object', 'additionalProperties': False, 'required': ['tags'], 'properties': {'image': {'oneOf': [{'type': 'string'}, {'type': 'object', 'properties': {'name': {'type': 'string'}, 'entrypoint': {'type': 'array', 'items': {'type': 'string'}}}}]}, 'tags': {'type': 'array', 'items': {'type': 'string'}}, 'variables': {'type': 'object', 'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}}, 'before_script': {'type': 'array', 'items': {'type': 'string'}}, 'script': {'type': 'array', 'items': {'type': 'string'}}, 'after_script': {'type': 'array', 'items': {'type': 'string'}}}}}}}, 'service-job-attributes': {'type': 'object', 'additionalProperties': False, 'required': ['tags'], 'properties': {'image': {'oneOf': [{'type': 'string'}, {'type': 'object', 'properties': {'name': {'type': 'string'}, 'entrypoint': {'type': 'array', 'items': {'type': 'string'}}}}]}, 'tags': {'type': 'array', 'items': {'type': 'string'}}, 'variables': {'type': 'object', 'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}}, 'before_script': {'type': 'array', 'items': {'type': 'string'}}, 'script': {'type': 'array', 'items': {'type': 'string'}}, 'after_script': {'type': 'array', 'items': {'type': 'string'}}}}, 'signing-job-attributes': {'type': 'object', 'additionalProperties': False, 'required': ['tags'], 'properties': {'image': {'oneOf': [{'type': 'string'}, {'type': 'object', 'properties': {'name': {'type': 'string'}, 'entrypoint': {'type': 'array', 'items': {'type': 'string'}}}}]}, 'tags': {'type': 'array', 'items': {'type': 'string'}}, 'variables': {'type': 'object', 'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}}, 'before_script': {'type': 
'array', 'items': {'type': 'string'}}, 'script': {'type': 'array', 'items': {'type': 'string'}}, 'after_script': {'type': 'array', 'items': {'type': 'string'}}}}, 'rebuild-index': {'type': 'boolean'}, 'broken-specs-url': {'type': 'string'}, 'broken-tests-packages': {'type': 'array', 'items': {'type': 'string'}}, 'enable-artifacts-buildcache': {'type': 'boolean'}}}, {'type': 'object', 'additionalProperties': False, 'required': ['mappings'], 'properties': {'image': {'oneOf': [{'type': 'string'}, {'type': 'object', 'properties': {'name': {'type': 'string'}, 'entrypoint': {'type': 'array', 'items': {'type': 'string'}}}}]}, 'tags': {'type': 'array', 'items': {'type': 'string'}}, 'variables': {'type': 'object', 'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}}, 'before_script': {'type': 'array', 'items': {'type': 'string'}}, 'script': {'type': 'array', 'items': {'type': 'string'}}, 'after_script': {'type': 'array', 'items': {'type': 'string'}}, 'bootstrap': {'type': 'array', 'items': {'anyOf': [{'type': 'string'}, {'type': 'object', 'additionalProperties': False, 'required': ['name'], 'properties': {'name': {'type': 'string'}, 'compiler-agnostic': {'type': 'boolean', 'default': False}}}]}}, 'match_behavior': {'type': 'string', 'enum': ['first', 'merge'], 'default': 'first'}, 'mappings': {'type': 'array', 'items': {'type': 'object', 'additionalProperties': False, 'required': ['match'], 'properties': {'match': {'type': 'array', 'items': {'type': 'string'}}, 'remove-attributes': {'type': 'object', 'additionalProperties': False, 'required': ['tags'], 'properties': {'tags': {'type': 'array', 'items': {'type': 'string'}}}}, 'runner-attributes': {'type': 'object', 'additionalProperties': False, 'required': ['tags'], 'properties': {'image': {'oneOf': [{'type': 'string'}, {'type': 'object', 'properties': {'name': {'type': 'string'}, 'entrypoint': {'type': 'array', 'items': {'type': 'string'}}}}]}, 'tags': {'type': 'array', 'items': {'type': 'string'}}, 'variables': 
{'type': 'object', 'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}}, 'before_script': {'type': 'array', 'items': {'type': 'string'}}, 'script': {'type': 'array', 'items': {'type': 'string'}}, 'after_script': {'type': 'array', 'items': {'type': 'string'}}}}}}}, 'service-job-attributes': {'type': 'object', 'additionalProperties': False, 'required': ['tags'], 'properties': {'image': {'oneOf': [{'type': 'string'}, {'type': 'object', 'properties': {'name': {'type': 'string'}, 'entrypoint': {'type': 'array', 'items': {'type': 'string'}}}}]}, 'tags': {'type': 'array', 'items': {'type': 'string'}}, 'variables': {'type': 'object', 'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}}, 'before_script': {'type': 'array', 'items': {'type': 'string'}}, 'script': {'type': 'array', 'items': {'type': 'string'}}, 'after_script': {'type': 'array', 'items': {'type': 'string'}}}}, 'signing-job-attributes': {'type': 'object', 'additionalProperties': False, 'required': ['tags'], 'properties': {'image': {'oneOf': [{'type': 'string'}, {'type': 'object', 'properties': {'name': {'type': 'string'}, 'entrypoint': {'type': 'array', 'items': {'type': 'string'}}}}]}, 'tags': {'type': 'array', 'items': {'type': 'string'}}, 'variables': {'type': 'object', 'patternProperties': {'[\\w\\d\\-_\\.]+': {'type': 'string'}}}, 'before_script': {'type': 'array', 'items': {'type': 'string'}}, 'script': {'type': 'array', 'items': {'type': 'string'}}, 'after_script': {'type': 'array', 'items': {'type': 'string'}}}}, 'rebuild-index': {'type': 'boolean'}, 'broken-specs-url': {'type': 'string'}, 'broken-tests-packages': {'type': 'array', 'items': {'type': 'string'}}, 'temporary-storage-url-prefix': {'type': 'string'}}}]}]}}, 'title': 'Spack CI configuration file schema', 'type': 'object'}, 'compilers': {'$schema': 'http://json-schema.org/draft-07/schema#', 'additionalProperties': False, 'properties': {'compilers': {'items': [{'type': 'object', 'additionalProperties': False, 'properties': 
{'compiler': {'type': 'object', 'additionalProperties': False, 'required': ['paths', 'spec', 'modules', 'operating_system'], 'properties': {'paths': {'type': 'object', 'required': ['cc', 'cxx', 'f77', 'fc'], 'additionalProperties': False, 'properties': {'cc': {'anyOf': [{'type': 'string'}, {'type': 'null'}]}, 'cxx': {'anyOf': [{'type': 'string'}, {'type': 'null'}]}, 'f77': {'anyOf': [{'type': 'string'}, {'type': 'null'}]}, 'fc': {'anyOf': [{'type': 'string'}, {'type': 'null'}]}}}, 'flags': {'type': 'object', 'additionalProperties': False, 'properties': {'cflags': {'anyOf': [{'type': 'string'}, {'type': 'null'}]}, 'cxxflags': {'anyOf': [{'type': 'string'}, {'type': 'null'}]}, 'fflags': {'anyOf': [{'type': 'string'}, {'type': 'null'}]}, 'cppflags': {'anyOf': [{'type': 'string'}, {'type': 'null'}]}, 'ldflags': {'anyOf': [{'type': 'string'}, {'type': 'null'}]}, 'ldlibs': {'anyOf': [{'type': 'string'}, {'type': 'null'}]}}}, 'spec': {'type': 'string'}, 'operating_system': {'type': 'string'}, 'target': {'type': 'string'}, 'alias': {'anyOf': [{'type': 'string'}, {'type': 'null'}]}, 'modules': {'anyOf': [{'type': 'string'}, {'type': 'null'}, {'type': 'array'}]}, 'implicit_rpaths': {'anyOf': [{'type': 'array', 'items': {'type': 'string'}}, {'type': 'boolean'}]}, 'environment': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'set': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'unset': {'type': 'array', 'default': [], 'items': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'prepend_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'append_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'remove_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}}}, 'extra_rpaths': {'type': 
'array', 'default': [], 'items': {'type': 'string'}}}}}}], 'type': 'array'}}, 'title': 'Spack compiler configuration file schema', 'type': 'object'}, 'concretizer': {'$schema': 'http://json-schema.org/draft-07/schema#', 'additionalProperties': False, 'properties': {'concretizer': {'additionalProperties': False, 'properties': {'enable_node_namespace': {'type': 'boolean'}, 'reuse': {'oneOf': [{'type': 'boolean'}, {'type': 'string', 'enum': ['dependencies']}]}, 'targets': {'properties': {'granularity': {'enum': ['generic', 'microarchitectures'], 'type': 'string'}, 'host_compatible': {'type': 'boolean'}}, 'type': 'object'}, 'unify': {'oneOf': [{'type': 'boolean'}, {'type': 'string', 'enum': ['when_possible']}]}}, 'type': 'object'}}, 'title': 'Spack concretizer configuration file schema', 'type': 'object'}, 'config': {'$schema': 'http://json-schema.org/draft-07/schema#', 'additionalProperties': False, 'properties': {'config': {'default': {}, 'deprecatedProperties': {'error': False, 'message': 'config:module_roots has been replaced by modules:[module set]:roots and is ignored', 'properties': ['module_roots']}, 'properties': {'additional_external_search_paths': {'items': {'type': 'string'}, 'type': 'array'}, 'allow_sgid': {'type': 'boolean'}, 'binary_index_root': {'type': 'string'}, 'binary_index_ttl': {'minimum': 0, 'type': 'integer'}, 'build_jobs': {'minimum': 1, 'type': 'integer'}, 'build_language': {'type': 'string'}, 'build_stage': {'oneOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}, 'ccache': {'type': 'boolean'}, 'checksum': {'type': 'boolean'}, 'concretizer': {'enum': ['original', 'clingo'], 'type': 'string'}, 'connect_timeout': {'minimum': 0, 'type': 'integer'}, 'db_lock_timeout': {'minimum': 1, 'type': 'integer'}, 'debug': {'type': 'boolean'}, 'deprecated': {'type': 'boolean'}, 'dirty': {'type': 'boolean'}, 'environments_root': {'type': 'string'}, 'extensions': {'items': {'type': 'string'}, 'type': 'array'}, 'flags': {'properties': 
{'keep_werror': {'enum': ['all', 'specific', 'none'], 'type': 'string'}}, 'type': 'object'}, 'install_hash_length': {'minimum': 1, 'type': 'integer'}, 'install_missing_compilers': {'type': 'boolean'}, 'install_path_scheme': {'type': 'string'}, 'install_tree': {'anyOf': [{'type': 'object', 'properties': {'root': {'type': 'string'}, 'padded_length': {'oneOf': [{'type': 'integer', 'minimum': 0}, {'type': 'boolean'}]}, 'projections': {'type': 'object', 'patternProperties': {'all|\\w[\\w-]*': {'type': 'string'}}}}}, {'type': 'string'}]}, 'license_dir': {'type': 'string'}, 'locks': {'type': 'boolean'}, 'misc_cache': {'type': 'string'}, 'package_lock_timeout': {'anyOf': [{'type': 'integer', 'minimum': 1}, {'type': 'null'}]}, 'shared_linking': {'anyOf': [{'type': 'string', 'enum': ['rpath', 'runpath']}, {'type': 'object', 'properties': {'type': {'type': 'string', 'enum': ['rpath', 'runpath']}, 'bind': {'type': 'boolean'}}}]}, 'source_cache': {'type': 'string'}, 'stage_name': {'type': 'string'}, 'suppress_gpg_warnings': {'type': 'boolean'}, 'template_dirs': {'items': {'type': 'string'}, 'type': 'array'}, 'test_stage': {'type': 'string'}, 'url_fetch_method': {'enum': ['urllib', 'curl'], 'type': 'string'}, 'verify_ssl': {'type': 'boolean'}}, 'type': 'object'}}, 'title': 'Spack core configuration file schema', 'type': 'object'}, 'mirrors': {'$schema': 'http://json-schema.org/draft-07/schema#', 'additionalProperties': False, 'properties': {'mirrors': {'additionalProperties': False, 'default': {}, 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'object', 'required': ['fetch', 'push'], 'properties': {'fetch': {'type': ['string', 'object']}, 'push': {'type': ['string', 'object']}}}]}}, 'type': 'object'}}, 'title': 'Spack mirror configuration file schema', 'type': 'object'}, 'modules': {'$schema': 'http://json-schema.org/draft-07/schema#', 'additionalProperties': False, 'properties': {'modules': {'additionalProperties': False, 'patternProperties': 
{'^(?!prefix_inspections$)\\w[\\w-]*$': {'additionalProperties': False, 'default': {}, 'properties': {'arch_folder': {'type': 'boolean'}, 'enable': {'default': [], 'items': {'enum': ['tcl', 'lmod'], 'type': 'string'}, 'type': 'array'}, 'lmod': {'allOf': [{'type': 'object', 'default': {}, 'allOf': [{'properties': {'verbose': {'type': 'boolean', 'default': False}, 'hash_length': {'type': 'integer', 'minimum': 0, 'default': 7}, 'whitelist': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'blacklist': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'blacklist_implicits': {'type': 'boolean', 'default': False}, 'include': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'exclude': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'exclude_implicits': {'type': 'boolean', 'default': False}, 'defaults': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'naming_scheme': {'type': 'string'}, 'projections': {'type': 'object', 'patternProperties': {'all|\\w[\\w-]*': {'type': 'string'}}}, 'all': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'filter': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'environment_blacklist': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'exclude_env_vars': {'type': 'array', 'default': [], 'items': {'type': 'string'}}}}, 'template': {'type': 'string'}, 'autoload': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'prerequisites': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'conflict': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'load': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'suffixes': {'type': 'object', 'validate_spec': True, 'patternProperties': {'\\w[\\w-]*': {'type': 'string'}}}, 'environment': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'set': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': 
{'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'unset': {'type': 'array', 'default': [], 'items': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'prepend_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'append_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'remove_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}}}}}}}, {'validate_spec': True, 'patternProperties': {'(?!hierarchy|core_specs|verbose|hash_length|defaults|filter_hierarchy_specs|whitelist|blacklist|include|exclude|projections|naming_scheme|core_compilers|all)(^\\w[\\w-]*)': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'filter': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'environment_blacklist': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'exclude_env_vars': {'type': 'array', 'default': [], 'items': {'type': 'string'}}}}, 'template': {'type': 'string'}, 'autoload': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'prerequisites': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'conflict': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'load': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'suffixes': {'type': 'object', 'validate_spec': True, 'patternProperties': {'\\w[\\w-]*': {'type': 'string'}}}, 'environment': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'set': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'unset': {'type': 'array', 'default': [], 'items': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'prepend_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'append_path': 
{'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'remove_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}}}}}, '^[\\^@%+~]': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'filter': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'environment_blacklist': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'exclude_env_vars': {'type': 'array', 'default': [], 'items': {'type': 'string'}}}}, 'template': {'type': 'string'}, 'autoload': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'prerequisites': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'conflict': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'load': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'suffixes': {'type': 'object', 'validate_spec': True, 'patternProperties': {'\\w[\\w-]*': {'type': 'string'}}}, 'environment': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'set': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'unset': {'type': 'array', 'default': [], 'items': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'prepend_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'append_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'remove_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}}}}}}}]}, {'type': 'object', 'properties': {'core_compilers': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'hierarchy': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'core_specs': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 
'filter_hierarchy_specs': {'type': 'object', 'patternProperties': {'(?!hierarchy|core_specs|verbose|hash_length|defaults|filter_hierarchy_specs|whitelist|blacklist|include|exclude|projections|naming_scheme|core_compilers|all)(^\\w[\\w-]*)': {'type': 'array', 'default': [], 'items': {'type': 'string'}}}}}}]}, 'prefix_inspections': {'additionalProperties': False, 'patternProperties': {'^[\\w-]*': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}}, 'type': 'object'}, 'roots': {'properties': {'lmod': {'type': 'string'}, 'tcl': {'type': 'string'}}, 'type': 'object'}, 'tcl': {'allOf': [{'type': 'object', 'default': {}, 'allOf': [{'properties': {'verbose': {'type': 'boolean', 'default': False}, 'hash_length': {'type': 'integer', 'minimum': 0, 'default': 7}, 'whitelist': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'blacklist': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'blacklist_implicits': {'type': 'boolean', 'default': False}, 'include': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'exclude': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'exclude_implicits': {'type': 'boolean', 'default': False}, 'defaults': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'naming_scheme': {'type': 'string'}, 'projections': {'type': 'object', 'patternProperties': {'all|\\w[\\w-]*': {'type': 'string'}}}, 'all': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'filter': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'environment_blacklist': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'exclude_env_vars': {'type': 'array', 'default': [], 'items': {'type': 'string'}}}}, 'template': {'type': 'string'}, 'autoload': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'prerequisites': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'conflict': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 
'load': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'suffixes': {'type': 'object', 'validate_spec': True, 'patternProperties': {'\\w[\\w-]*': {'type': 'string'}}}, 'environment': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'set': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'unset': {'type': 'array', 'default': [], 'items': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'prepend_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'append_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'remove_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}}}}}}}, {'validate_spec': True, 'patternProperties': {'(?!hierarchy|core_specs|verbose|hash_length|defaults|filter_hierarchy_specs|whitelist|blacklist|include|exclude|projections|naming_scheme|core_compilers|all)(^\\w[\\w-]*)': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'filter': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'environment_blacklist': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'exclude_env_vars': {'type': 'array', 'default': [], 'items': {'type': 'string'}}}}, 'template': {'type': 'string'}, 'autoload': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'prerequisites': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'conflict': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'load': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'suffixes': {'type': 'object', 'validate_spec': True, 'patternProperties': {'\\w[\\w-]*': {'type': 'string'}}}, 'environment': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'set': {'type': 'object', 
'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'unset': {'type': 'array', 'default': [], 'items': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'prepend_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'append_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'remove_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}}}}}, '^[\\^@%+~]': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'filter': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'environment_blacklist': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'exclude_env_vars': {'type': 'array', 'default': [], 'items': {'type': 'string'}}}}, 'template': {'type': 'string'}, 'autoload': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'prerequisites': {'type': 'string', 'enum': ['none', 'direct', 'all']}, 'conflict': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'load': {'type': 'array', 'default': [], 'items': {'type': 'string'}}, 'suffixes': {'type': 'object', 'validate_spec': True, 'patternProperties': {'\\w[\\w-]*': {'type': 'string'}}}, 'environment': {'type': 'object', 'default': {}, 'additionalProperties': False, 'properties': {'set': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'unset': {'type': 'array', 'default': [], 'items': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}, 'prepend_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'append_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}, 'remove_path': {'type': 'object', 'patternProperties': {'\\w[\\w-]*': 
{'anyOf': [{'type': 'string'}, {'type': 'number'}]}}}}}}}}}]}, {}]}, 'use_view': {'anyOf': [{'type': 'string'}, {'type': 'boolean'}]}}, 'type': 'object'}}, 'properties': {'prefix_inspections': {'additionalProperties': False, 'patternProperties': {'^[\\w-]*': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}}, 'type': 'object'}}, 'type': 'object'}}, 'title': 'Spack module file configuration file schema', 'type': 'object'}, 'packages': {'$schema': 'http://json-schema.org/draft-07/schema#', 'additionalProperties': False, 'properties': {'packages': {'additionalProperties': False, 'default': {}, 'patternProperties': {'\\w[\\w-]*': {'additionalProperties': False, 'default': {}, 'properties': {'buildable': {'default': True, 'type': 'boolean'}, 'compiler': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}, 'externals': {'items': {'additionalProperties': True, 'properties': {'extra_attributes': {'type': 'object'}, 'modules': {'items': {'type': 'string'}, 'type': 'array'}, 'prefix': {'type': 'string'}, 'spec': {'type': 'string'}}, 'required': ['spec'], 'type': 'object'}, 'type': 'array'}, 'package_attributes': {'additionalProperties': False, 'patternProperties': {'\\w+': {}}, 'type': 'object'}, 'permissions': {'additionalProperties': False, 'properties': {'group': {'type': 'string'}, 'read': {'enum': ['user', 'group', 'world'], 'type': 'string'}, 'write': {'enum': ['user', 'group', 'world'], 'type': 'string'}}, 'type': 'object'}, 'providers': {'additionalProperties': False, 'default': {}, 'patternProperties': {'\\w[\\w-]*': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}}, 'type': 'object'}, 'require': {'oneOf': [{'type': 'array', 'items': {'oneOf': [{'type': 'object', 'additionalProperties': False, 'properties': {'one_of': {'type': 'array', 'items': {'type': 'string'}}, 'any_of': {'type': 'array', 'items': {'type': 'string'}}, 'spec': {'type': 'string'}, 'message': {'type': 'string'}, 'when': {'type': 'string'}}}, {'type': 'string'}]}}, 
{'type': 'string'}]}, 'target': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}, 'variants': {'oneOf': [{'type': 'string'}, {'type': 'array', 'items': {'type': 'string'}}]}, 'version': {'default': [], 'items': {'anyOf': [{'type': 'string'}, {'type': 'number'}]}, 'type': 'array'}}, 'type': 'object'}}, 'type': 'object'}}, 'title': 'Spack package configuration file schema', 'type': 'object'}, 'repos': {'$schema': 'http://json-schema.org/draft-07/schema#', 'additionalProperties': False, 'properties': {'repos': {'default': [], 'items': {'type': 'string'}, 'type': 'array'}}, 'title': 'Spack repository configuration file schema', 'type': 'object'}, 'upstreams': {'$schema': 'http://json-schema.org/draft-07/schema#', 'additionalProperties': False, 'properties': {'upstreams': {'default': {}, 'patternProperties': {'\\w[\\w-]*': {'additionalProperties': False, 'default': {}, 'properties': {'install_tree': {'type': 'string'}, 'modules': {'properties': {'lmod': {'type': 'string'}, 'tcl': {'type': 'string'}}, 'type': 'object'}}, 'type': 'object'}}, 'type': 'object'}}, 'title': 'Spack core configuration file schema', 'type': 'object'}}
Dict from section names -> schema for that section
- spack.config.set(path, value, scope=None)[source]
Convenience function for setting single values in config files.
Accepts the path syntax described in get().
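The colon-separated path syntax can be sketched with a small helper; `set_path` is a hypothetical name for illustration only, and the real implementation additionally handles configuration scopes and list elements, which are omitted here.

```python
def set_path(data: dict, path: str, value):
    """Write `value` into nested `data` at a colon-separated path,
    e.g. 'config:install_tree:root'. Illustrative sketch only."""
    *parents, leaf = path.split(":")
    node = data
    for key in parents:
        # create intermediate sections as needed
        node = node.setdefault(key, {})
    node[leaf] = value

cfg = {}
set_path(cfg, "config:install_tree:root", "/opt/spack")
print(cfg)  # {'config': {'install_tree': {'root': '/opt/spack'}}}
```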
- spack.config.use_configuration(*scopes_or_paths)[source]
Use the configuration scopes passed as arguments within the context manager.
- Parameters:
*scopes_or_paths – scope objects or paths to be used
- Returns:
Configuration object associated with the scopes passed as arguments
spack.cray_manifest module
- exception spack.cray_manifest.ManifestValidationError(msg, long_msg=None)[source]
Bases:
SpackError
- spack.cray_manifest.default_path = '/opt/cray/pe/cpe-descriptive-manifest/'
Cray systems can store a Spack-compatible description of system packages here.
- spack.cray_manifest.translated_compiler_name(manifest_compiler_name)[source]
When creating a Compiler object, Spack expects a name matching one of the classes in spack.compilers. Names in the Cray manifest may differ; for cases where we know the name refers to a compiler in Spack, this function translates it automatically.
This function will raise an error if there is no recorded translation and the name doesn’t match a known compiler name.
spack.database module
Spack’s installation tracking database.
The database serves two purposes:
It implements a cache on top of a potentially very large Spack directory hierarchy, speeding up many operations that would otherwise require filesystem access.
It will allow us to track external installations as well as lost packages and their dependencies.
Prior to the implementation of this store, a directory layout served as the authoritative database of packages in Spack. This module provides a cache and a sanity checking mechanism for what is in the filesystem.
- exception spack.database.CorruptDatabaseError(message, long_message=None)[source]
Bases:
SpackError
Raised when errors are found while reading the database.
- class spack.database.Database(root, db_dir=None, upstream_dbs=None, is_upstream=False, enable_transaction_locking=True, record_fields=['spec', 'ref_count', 'path', 'installed', 'explicit', 'installation_time', 'deprecated_for'])[source]
Bases:
object
Per-process lock objects for each install prefix.
- clear_failure(spec, force=False)[source]
Remove any persistent and cached failure tracking for the spec.
See mark_failed().
- Parameters:
spec (spack.spec.Spec) – the spec whose failure indicators are being removed
force (bool) – True if the failure information should be cleared when a prefix failure lock exists for the file or False if the failure should not be cleared (e.g., it may be associated with a concurrent build)
- get_by_hash(dag_hash, default=None, installed=<built-in function any>)[source]
Look up a spec by DAG hash, or by a DAG hash prefix.
- Parameters:
dag_hash (str) – hash (or hash prefix) to look up
default (object or None) – default value to return if dag_hash is not in the DB (default: None)
installed (bool or InstallStatus or Iterable or None) – if True, includes only installed specs in the search; if False, only missing specs; and if any, all specs in the database. If an InstallStatus or an iterable of InstallStatus, returns specs whose install status (installed, deprecated, or missing) matches (one of) the InstallStatus. (default: any)
installed defaults to any so that we can refer to any known hash. Note that query() and query_one() differ in that they only return installed specs by default.
- Returns:
a list of specs matching the hash or hash prefix
- Return type:
(list)
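Hash-prefix matching can be sketched as a simple scan over known records; the `records` layout and the helper name `match_hash_prefix` are hypothetical, not Spack's internal representation.

```python
def match_hash_prefix(records: dict, dag_hash: str, default=None):
    """Return all specs whose full DAG hash starts with `dag_hash`
    (a complete hash or a prefix); `records` maps hash -> spec."""
    matches = [spec for h, spec in records.items() if h.startswith(dag_hash)]
    return matches or default

records = {"abc123": "zlib@1.2.13", "abd456": "bzip2@1.0.8"}
print(match_hash_prefix(records, "abc"))  # ['zlib@1.2.13']
print(match_hash_prefix(records, "ab"))   # both specs: the prefix is ambiguous
```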
- get_by_hash_local(*args, **kwargs)[source]
Look up a spec in this DB by DAG hash, or by a DAG hash prefix.
- Parameters:
dag_hash (str) – hash (or hash prefix) to look up
default (object or None) – default value to return if dag_hash is not in the DB (default: None)
installed (bool or InstallStatus or Iterable or None) – if
True
, includes only installed specs in the search; ifFalse
only missing specs, and ifany
, all specs in database. If an InstallStatus or iterable of InstallStatus, returns specs whose install status (installed, deprecated, or missing) matches (one of) the InstallStatus. (default: any)
installed
defaults toany
so that we can refer to any known hash. Note thatquery()
andquery_one()
differ in that they only return installed specs by default.- Returns:
a list of specs matching the hash or hash prefix
- Return type:
(list)
- mark_failed(spec)[source]
Mark a spec as failing to install.
Prefix failure marking takes the form of a byte range lock on the nth byte of a file for coordinating between concurrent parallel build processes and a persistent file, named with the full hash and containing the spec, in a subdirectory of the database to enable persistence across overlapping but separate related build processes.
The failure lock file, spack.store.db.prefix_failures, lives alongside the install DB. n is the sys.maxsize-bit prefix of the associated DAG hash, which makes the likelihood of collision very low with no cleanup required.
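One plausible reading of the byte-offset computation is sketched below; `hash_bit_prefix` is a hypothetical helper name, and the real offset derivation lives inside the Database/Lock machinery.

```python
import sys

def hash_bit_prefix(dag_hash: str, bits: int = sys.maxsize.bit_length()) -> int:
    """Return the leading `bits` bits of a hex DAG hash as an integer,
    usable as a deterministic byte offset for a byte-range lock."""
    value = int(dag_hash, 16)
    total_bits = len(dag_hash) * 4  # each hex digit carries 4 bits
    return value >> max(total_bits - bits, 0)

# The offset is deterministic per hash, so concurrent build processes
# agree on which byte to lock without any extra coordination.
print(hash_bit_prefix("f" * 32, bits=63))
```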
- prefix_lock(spec, timeout=None)[source]
Get a lock on a particular spec’s installation directory.
NOTE: The installation directory does not need to exist.
Prefix lock is a byte range lock on the nth byte of a file.
The lock file is spack.store.db.prefix_lock – the DB tells us what to call it and it lives alongside the install DB. n is the sys.maxsize-bit prefix of the DAG hash. This makes the likelihood of collision very low AND gives us readers-writer lock semantics with just a single lockfile, so no cleanup is required.
- query(*args, **kwargs)[source]
Query the Spack database including all upstream databases.
- Parameters:
query_spec – queries iterate through specs in the database and return those that satisfy the supplied query_spec. If query_spec is any, this will match all specs in the database. If it is a spec, we’ll evaluate spec.satisfies(query_spec).
known (bool or None) – Specs that are “known” are those for which Spack can locate a package.py file – i.e., Spack “knows” how to install them. Specs that are unknown may represent packages that existed in a previous version of Spack, but have since either changed their name or been removed.
installed (bool or InstallStatus or Iterable or None) – if True, includes only installed specs in the search; if False, only missing specs; and if any, all specs in the database. If an InstallStatus or an iterable of InstallStatus, returns specs whose install status (installed, deprecated, or missing) matches (one of) the InstallStatus. (default: True)
explicit (bool or None) – A spec that was installed following a specific user request is marked as explicit. If instead it was pulled in as a dependency of a user-requested spec, it is considered implicit.
start_date (datetime.datetime or None) – filters the query, discarding specs that were installed before start_date.
end_date (datetime.datetime or None) – filters the query, discarding specs that were installed after end_date.
hashes (Container) – list or set of hashes that we can use to restrict the search
in_buildcache (bool or None) – Specs that are marked in this database as part of an associated binary cache are in_buildcache. All other specs are not. This field is used for querying mirror indices. Default is any.
- Returns:
list of specs that match the query
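The semantics of the installed argument can be sketched over a toy record list; the pair layout and the helper name `filter_by_status` are hypothetical, for illustration only.

```python
def filter_by_status(records, installed=True):
    """records: (spec, status) pairs with status in
    {'installed', 'deprecated', 'missing'}; `installed` may be
    True, False, the builtin `any`, a status string, or an
    iterable of status strings."""
    if installed is any:
        return [s for s, _ in records]       # every known spec
    if installed is True:
        return [s for s, st in records if st == "installed"]
    if installed is False:
        return [s for s, st in records if st == "missing"]
    wanted = {installed} if isinstance(installed, str) else set(installed)
    return [s for s, st in records if st in wanted]

recs = [("zlib", "installed"), ("old-pkg", "missing"), ("hdf5", "deprecated")]
print(filter_by_status(recs))                 # ['zlib']
print(filter_by_status(recs, installed=any))  # all three specs
```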
- query_by_spec_hash(hash_key, data=None)[source]
Get a spec for hash, and whether it’s installed upstream.
- Returns:
- (bool, optional InstallRecord): bool tells us whether the spec is installed upstream. Its InstallRecord is also returned if it’s installed at all; otherwise None.
- Return type:
(tuple)
- query_local(*args, **kwargs)[source]
Query only the local Spack database.
This function doesn’t guarantee any sorting of the returned data for performance reasons, since comparing specs via __lt__ may be an expensive operation.
- Parameters:
query_spec – queries iterate through specs in the database and return those that satisfy the supplied query_spec. If query_spec is any, this will match all specs in the database. If it is a spec, we’ll evaluate spec.satisfies(query_spec).
known (bool or None) – Specs that are “known” are those for which Spack can locate a package.py file – i.e., Spack “knows” how to install them. Specs that are unknown may represent packages that existed in a previous version of Spack, but have since either changed their name or been removed.
installed (bool or InstallStatus or Iterable or None) – if True, includes only installed specs in the search; if False, only missing specs; and if any, all specs in the database. If an InstallStatus or an iterable of InstallStatus, returns specs whose install status (installed, deprecated, or missing) matches (one of) the InstallStatus. (default: True)
explicit (bool or None) – A spec that was installed following a specific user request is marked as explicit. If instead it was pulled in as a dependency of a user-requested spec, it is considered implicit.
start_date (datetime.datetime or None) – filters the query, discarding specs that were installed before start_date.
end_date (datetime.datetime or None) – filters the query, discarding specs that were installed after end_date.
hashes (Container) – list or set of hashes that we can use to restrict the search
in_buildcache (bool or None) – Specs that are marked in this database as part of an associated binary cache are in_buildcache. All other specs are not. This field is used for querying mirror indices. Default is any.
- Returns:
list of specs that match the query
- query_local_by_spec_hash(hash_key)[source]
Get a spec by hash in the local database
- Returns:
- InstallRecord when installed locally, otherwise None.
- Return type:
(InstallRecord or None)
- query_one(query_spec, known=<built-in function any>, installed=True)[source]
Query for exactly one spec that matches the query spec.
Raises an assertion error if more than one spec matches the query. Returns None if no installed package matches.
- reindex(directory_layout)[source]
Build database index from scratch based on a directory layout.
Locks the DB if it isn’t locked already.
- property unused_specs
Return all the specs that are currently installed but not needed at runtime to satisfy user’s requests.
- Specs in the return list are those that are neither:
Installed on an explicit user request, nor
Installed as a “run” or “link” dependency (even transitively) of a spec in point 1.
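The two conditions above amount to a reachability computation from explicitly installed roots over run/link edges; a minimal sketch with hypothetical inputs (not the actual Database representation):

```python
def unused_specs(explicit_roots, runlink_deps, all_specs):
    """Return specs not reachable from any explicit root via
    run/link dependency edges."""
    needed, stack = set(), list(explicit_roots)
    while stack:
        name = stack.pop()
        if name not in needed:
            needed.add(name)
            stack.extend(runlink_deps.get(name, ()))
    return sorted(set(all_specs) - needed)

# curl was requested explicitly and links to zlib; cmake was only
# a build dependency, so it is unused at runtime.
deps = {"cmake": [], "zlib": [], "curl": ["zlib"]}
print(unused_specs(["curl"], deps, ["curl", "zlib", "cmake"]))  # ['cmake']
```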
- update_explicit(spec, explicit)[source]
Update the spec’s explicit state in the database.
- Parameters:
spec (spack.spec.Spec) – the spec whose install record is being updated
explicit (bool) –
True
if the package was requested explicitly by the user,False
if it was pulled in as a dependency of an explicit package.
- exception spack.database.ForbiddenLockError(message, long_message=None)[source]
Bases:
SpackError
Raised when an upstream DB attempts to acquire a lock
- class spack.database.InstallRecord(spec, path, installed, ref_count=0, explicit=False, installation_time=None, deprecated_for=None, in_buildcache=False, origin=None)[source]
Bases:
object
A record represents one installation in the DB.
The record keeps track of the spec for the installation, its install path, AND whether or not it is installed. We need the installed flag in case a user either:
blew away a directory, or
used spack uninstall -f to get rid of it
If, in either case, the package was removed but others still depend on it, we still need to track its spec, so we don’t actually remove from the database until a spec has no installed dependents left.
- Parameters:
spec (spack.spec.Spec) – spec tracked by the install record
path (str) – path where the spec has been installed
installed (bool) – whether or not the spec is currently installed
ref_count (int) – number of specs that depend on this one
explicit (bool or None) – whether or not this spec was explicitly installed, or pulled-in as a dependency of something else
installation_time (datetime.datetime or None) – time of the installation
- class spack.database.InstallStatuses[source]
Bases:
object
- DEPRECATED = 'deprecated'
- INSTALLED = 'installed'
- MISSING = 'missing'
- exception spack.database.InvalidDatabaseVersionError(database, expected, found)[source]
Bases:
SpackError
Exception raised when the database metadata is newer than current Spack.
- property database_version_message
- exception spack.database.MissingDependenciesError(message, long_message=None)[source]
Bases:
SpackError
Raised when DB cannot find records for dependencies
- exception spack.database.NonConcreteSpecAddError(message, long_message=None)[source]
Bases:
SpackError
Raised when attempting to add non-concrete spec to DB.
- exception spack.database.UpstreamDatabaseLockingError(message, long_message=None)[source]
Bases:
SpackError
Raised when an operation would need to lock an upstream database
spack.dependency module
Data structures that represent Spack’s dependency relationships.
- class spack.dependency.Dependency(pkg: PackageBase, spec: Spec, type: Tuple[str, ...] | None = ('build', 'link'))[source]
Bases:
object
Class representing metadata for a dependency on a package.
This class differs from spack.spec.DependencySpec because it represents metadata at the Package level. spack.spec.DependencySpec is a descriptor for an actual package configuration, while Dependency is a descriptor for a package’s dependency requirements.
A dependency is a requirement for a configuration of another package that satisfies a particular spec. The dependency can have types, which determine how that package configuration is required, e.g. whether it is required for building the package, whether it needs to be linked to, or whether it is needed at runtime so that Spack can call commands from it.
A package can also depend on another package with patches. This is for cases where the maintainers of one package also maintain special patches for their dependencies. If one package depends on another with patches, a special version of that dependency with patches applied will be built for use by the dependent package. The patches are included in the new version’s spec hash to differentiate it from unpatched versions of the same package, so that unpatched versions of the dependency package can coexist with the patched version.
- merge(other: Dependency)[source]
Merge constraints, deptypes, and patches of other into self.
- spack.dependency.DependencyArgument
Type hint for the arguments accepting a dependency type
- spack.dependency.all_deptypes = ('build', 'link', 'run', 'test')
The types of dependency relationships that Spack understands.
- spack.dependency.canonical_deptype(deptype: str | List[str] | Tuple[str, ...]) Tuple[str, ...] [source]
Convert deptype to a canonical sorted tuple, or raise ValueError.
- Parameters:
deptype – string representing a dependency type, or a list/tuple of such strings. Can also be the builtin function all or the string ‘all’, both of which result in a tuple of all dependency types known to Spack.
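A behavioral sketch of the canonicalization described above (illustrative, not Spack's actual implementation):

```python
ALL_TYPES = ("build", "link", "run", "test")

def canonical_deptype(deptype):
    """Normalize a deptype argument into a canonical sorted tuple,
    or raise ValueError for unknown types."""
    if deptype is all or deptype == "all":
        return ALL_TYPES
    if isinstance(deptype, str):
        deptype = (deptype,)
    bad = [t for t in deptype if t not in ALL_TYPES]
    if bad:
        raise ValueError(f"invalid dependency types: {bad}")
    # canonical order follows ALL_TYPES; duplicates collapse
    return tuple(t for t in ALL_TYPES if t in deptype)

print(canonical_deptype(["run", "build"]))  # ('build', 'run')
print(canonical_deptype(all))               # all four types
```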
- spack.dependency.default_deptype = ('build', 'link')
Default dependency type if none is specified
- spack.dependency.deptype_chars(*type_tuples: str) str [source]
Create a string representing deptypes for many dependencies.
The string will be some subset of ‘blrt’, like ‘bl ‘, ‘b t’, or ‘ lr ‘ where each letter in ‘blrt’ stands for ‘build’, ‘link’, ‘run’, and ‘test’ (the dependency types).
For a single dependency, this just indicates that the dependency has the indicated deptypes. For a list of dependencies, this shows whether ANY dependency in the list has the deptypes (so the deptypes are merged).
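The 'blrt' encoding can be sketched like this, with the merging semantics described above (a behavioral illustration, not the actual implementation):

```python
ALL_DEPTYPES = ("build", "link", "run", "test")

def deptype_chars(*type_tuples):
    """Each position in 'blrt' shows its letter if ANY of the
    given dependencies has that deptype, else a space."""
    present = {t for tup in type_tuples for t in tup}
    return "".join(t[0] if t in present else " " for t in ALL_DEPTYPES)

print(repr(deptype_chars(("build", "link"))))      # 'bl  '
print(repr(deptype_chars(("build",), ("test",))))  # 'b  t' (merged)
```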
spack.directives module
This package contains directives that can be used within a package.
Directives are functions that can be called inside a package definition to modify the package, for example:
- class OpenMpi(Package):
      depends_on("hwloc")
      provides("mpi")
      ...
provides and depends_on are Spack directives.
The available directives are:
build_system
conflicts
depends_on
extends
patch
provides
resource
variant
version
requires
- exception spack.directives.DirectiveError(message, long_message=None)[source]
Bases:
SpackError
This is raised when something is wrong with a package directive.
- class spack.directives.DirectiveMeta(name, bases, attr_dict)[source]
Bases:
type
Flushes the directives that were temporarily stored in the staging area into the package.
- static directive(dicts=None)[source]
Decorator for Spack directives.
Spack directives allow you to modify a package while it is being defined, e.g. to add version or dependency information. Directives are one of the key pieces of Spack’s package “language”, which is embedded in python.
Here’s an example directive:
@directive(dicts='versions')
version(pkg, ...):
    ...
This directive allows you to write:
class Foo(Package):
    version(...)
The @directive decorator handles a couple of things for you:
It adds the class scope (pkg) as an initial parameter when called, like a class method would. This allows you to modify a package from within a directive, while the package is still being defined.
It automatically adds a dictionary called “versions” to the package so that you can refer to pkg.versions.
The (dicts='versions') part ensures that ALL packages in Spack will have a versions attribute after they’re constructed, and that if no directive actually modified it, it will just be an empty dict.
This is just a modular way to add storage attributes to the Package class, and it’s how Spack gets information from the packages to the core.
- spack.directives.conflicts(conflict_spec, when=None, msg=None)[source]
Allows a package to define a conflict.
Currently, a “conflict” is a concretized configuration that is known to be non-valid. For example, a package that is known not to be buildable with intel compilers can declare:
conflicts('%intel')
To express the same constraint only when the ‘foo’ variant is activated:
conflicts('%intel', when='+foo')
- Parameters:
conflict_spec (spack.spec.Spec) – constraint defining the known conflict
when (spack.spec.Spec) – optional constraint that triggers the conflict
msg (str) – optional user defined message
- spack.directives.depends_on(spec, when=None, type=('build', 'link'), patches=None)[source]
Creates a dict of deps with specs defining when they apply.
- Parameters:
spec (spack.spec.Spec or str) – the package and constraints depended on
when (spack.spec.Spec or str) – when the dependent satisfies this, it has the dependency represented by spec
patches (Callable or list) – single result of the patch() directive, a str to be passed to patch, or a list of these
This directive is to be used inside a Package definition to declare that the package requires other packages to be built first. @see The section “Dependency specs” in the Spack Packaging Guide.
- spack.directives.extends(spec, type=('build', 'run'), **kwargs)[source]
Same as depends_on, but also adds this package to the extendee list.
keyword arguments can be passed to extends() so that extension packages can pass parameters to the extendee’s extension mechanism.
- spack.directives.maintainers(*names: str)[source]
Add a new maintainer directive, to specify maintainers in a declarative way.
- Parameters:
names – GitHub username for the maintainer
- spack.directives.patch(url_or_filename, level=1, when=None, working_dir='.', **kwargs)[source]
Packages can declare patches to apply to source. You can optionally provide a when spec to indicate that a particular patch should only be applied when the package’s spec meets certain conditions (e.g. a particular version).
- Parameters:
url_or_filename (str) – url or relative filename of the patch
level (int) – patch level (as in the patch shell command)
when (spack.spec.Spec) – optional anonymous spec that specifies when to apply the patch
working_dir (str) – dir to change to before applying
- Keyword Arguments:
- spack.directives.provides(*specs, **kwargs)[source]
Allows packages to provide a virtual dependency. If a package provides ‘mpi’, other packages can declare that they depend on “mpi”, and spack can use the providing package to satisfy the dependency.
- spack.directives.requires(*requirement_specs, policy='one_of', when=None, msg=None)[source]
Allows a package to request a configuration to be present in all valid solutions.
For instance, a package that is known to compile only with GCC can declare:
requires("%gcc")
A package that requires Apple-Clang on Darwin can declare instead:
requires("%apple-clang", when="platform=darwin", msg="Apple Clang is required on Darwin")
- Parameters:
requirement_specs – spec expressing the requirement
when – optional constraint that triggers the requirement. If None the requirement is applied unconditionally.
msg – optional user defined message
- spack.directives.resource(**kwargs)[source]
Define an external resource to be fetched and staged when building the package. Based on the keywords present in the dictionary the appropriate FetchStrategy will be used for the resource. Resources are fetched and staged in their own folder inside spack stage area, and then moved into the stage area of the package that needs them.
List of recognized keywords:
‘when’ : (optional) represents the condition upon which the resource is needed
‘destination’ : (optional) path to which the resource should be moved, relative to the main package stage area.
‘placement’ : (optional) fine-tunes how the resource is placed inside the main package stage area.
- spack.directives.variant(name, default=None, description='', values=None, multi=None, validator=None, when=None, sticky=False)[source]
Define a variant for the package. Packager can specify a default value as well as a text description.
- Parameters:
name (str) – name of the variant
default (str or bool) – default value for the variant, if not specified otherwise the default will be False for a boolean variant and ‘nothing’ for a multi-valued variant
description (str) – description of the purpose of the variant
values (tuple or Callable) – either a tuple of strings containing the allowed values, or a callable accepting one value and returning True if it is valid
multi (bool) – if False only one value per spec is allowed for this variant
validator (Callable) – optional group validator to enforce additional logic. It receives the package name, the variant name and a tuple of values and should raise an instance of SpackError if the group doesn’t meet the additional constraints
when (spack.spec.Spec, bool) – optional condition on which the variant applies
sticky (bool) – the variant should not be changed by the concretizer to find a valid concrete spec.
- Raises:
DirectiveError – if arguments passed to the directive are invalid
- spack.directives.version(ver: str | int, checksum: str | None = None, *, preferred: bool | None = None, deprecated: bool | None = None, no_cache: bool | None = None, url: str | None = None, extension: str | None = None, expand: bool | None = None, fetch_options: dict | None = None, md5: str | None = None, sha1: str | None = None, sha224: str | None = None, sha256: str | None = None, sha384: str | None = None, sha512: str | None = None, git: str | None = None, commit: str | None = None, tag: str | None = None, branch: str | None = None, get_full_repo: bool | None = None, submodules: bool | None = None, submodules_delete: bool | None = None, svn: str | None = None, hg: str | None = None, cvs: str | None = None, revision: str | None = None, date: str | None = None)[source]
Adds a version and, if appropriate, metadata for fetching its code.
The version directives are aggregated into a versions dictionary attribute with Version keys and metadata values, where the metadata is stored as a dictionary of kwargs.
The (keyword) arguments are turned into a valid fetch strategy for code packages later. See spack.fetch_strategy.for_package_version().
spack.directory_layout module
- class spack.directory_layout.DirectoryLayout(root, **kwargs)[source]
Bases:
object
A directory layout is used to associate unique paths with specs. Different installations are going to want different layouts for their install, and they can use this to customize the nesting structure of spack installs. The default layout is:
<install root>/
<platform-os-target>/
<compiler>-<compiler version>/
<name>-<version>-<hash>
The hash here is a SHA-1 hash for the full DAG plus the build spec.
The installation directory projections can be modified with the projections argument.
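A projection is essentially a format string evaluated against spec attributes; here is a toy sketch with a hypothetical spec dict (real Spack resolves these fields from a Spec object, and the default projection string below is an assumption modeled on the layout shown above):

```python
DEFAULT_PROJECTION = "{architecture}/{compiler}-{compiler_version}/{name}-{version}-{hash}"

def install_path(root, spec, projection=DEFAULT_PROJECTION):
    """Map a spec's attributes into a unique prefix under `root`."""
    return f"{root}/{projection.format(**spec)}"

spec = {
    "architecture": "linux-ubuntu22.04-x86_64",
    "compiler": "gcc", "compiler_version": "12.3.0",
    "name": "zlib", "version": "1.2.13", "hash": "abc1234",
}
print(install_path("/opt/spack", spec))
# /opt/spack/linux-ubuntu22.04-x86_64/gcc-12.3.0/zlib-1.2.13-abc1234
```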
- deprecated_file_path(deprecated_spec, deprecator_spec=None)[source]
Gets full path to spec file for deprecated spec
If the deprecator_spec is provided, use that. Otherwise, assume deprecated_spec is already deprecated and its prefix links to the prefix of its deprecator.
- ensure_installed(spec)[source]
Throws InconsistentInstallDirectoryError if:
1. spec prefix does not exist,
2. spec prefix does not contain a spec file, or
3. we read a spec with the wrong DAG hash out of an existing install directory.
- remove_install_directory(spec, deprecated=False)[source]
Removes a prefix and any empty parent directories from the root. Raises RemoveFailedError if something goes wrong.
- exception spack.directory_layout.DirectoryLayoutError(message, long_msg=None)[source]
Bases:
SpackError
Superclass for directory layout errors.
- exception spack.directory_layout.ExtensionAlreadyInstalledError(spec, ext_spec)[source]
Bases:
DirectoryLayoutError
Raised when an extension is added to a package that already has it.
- exception spack.directory_layout.ExtensionConflictError(spec, ext_spec, conflict)[source]
Bases:
DirectoryLayoutError
Raised when an extension is added to a package that already has it.
- exception spack.directory_layout.InconsistentInstallDirectoryError(message, long_msg=None)[source]
Bases:
DirectoryLayoutError
Raised when a package seems to be installed to the wrong place.
- exception spack.directory_layout.InvalidDirectoryLayoutParametersError(message, long_msg=None)[source]
Bases:
DirectoryLayoutError
Raised when invalid directory layout parameters are supplied
- exception spack.directory_layout.InvalidExtensionSpecError(message, long_msg=None)[source]
Bases:
DirectoryLayoutError
Raised when an extension file has a bad spec in it.
- exception spack.directory_layout.RemoveFailedError(installed_spec, prefix, error)[source]
Bases:
DirectoryLayoutError
Raised when a DirectoryLayout cannot remove an install prefix.
- exception spack.directory_layout.SpecReadError(message, long_msg=None)[source]
Bases:
DirectoryLayoutError
Raised when directory layout can’t read a spec.
spack.error module
- exception spack.error.NoHeadersError(message, long_message=None)[source]
Bases:
SpackError
Raised when package headers are requested but cannot be found
- exception spack.error.NoLibrariesError(message_or_name, prefix=None)[source]
Bases:
SpackError
Raised when package libraries are requested but cannot be found
- exception spack.error.SpackError(message, long_message=None)[source]
Bases:
Exception
This is the superclass for all Spack errors. Subclasses can be found in the modules they have to do with.
- property long_message
- print_context()[source]
Print extended debug information about this exception.
This is usually printed when the top-level Spack error handler calls
die()
, but it can be called separately beforehand if a lower-level error handler needs to print error context and continue without raising the exception to the top level.
- exception spack.error.SpecError(message, long_message=None)[source]
Bases:
SpackError
Superclass for all errors that occur while constructing specs.
- exception spack.error.UnsatisfiableSpecError(provided, required, constraint_type)[source]
Bases:
SpecError
Raised when a spec conflicts with package constraints.
For original concretizer, provide the requirement that was violated when raising.
- exception spack.error.UnsupportedPlatformError(message)[source]
Bases:
SpackError
Raised by packages when a platform is not supported
- spack.error.debug = 0
At what level we should write stack traces or short error messages. This is module-scoped because it needs to be set very early.
spack.extensions module
Service functions and classes to implement the hooks for Spack’s command extensions.
- exception spack.extensions.CommandNotFoundError(cmd_name)[source]
Bases:
SpackError
Exception class thrown when a requested command is not recognized as such.
- exception spack.extensions.ExtensionNamingError(path)[source]
Bases:
SpackError
Exception class thrown when a configured extension does not follow the expected naming convention.
- spack.extensions.extension_name(path)[source]
Returns the name of the extension in the path passed as argument.
- Parameters:
path (str) – path where the extension resides
- Returns:
The extension name.
- Raises:
ExtensionNamingError – if path does not match the expected format for a Spack command extension.
- spack.extensions.get_command_paths()[source]
Return the list of paths where to search for command files.
- spack.extensions.get_extension_paths()[source]
Return the list of canonicalized extension paths from config:extensions.
- spack.extensions.get_module(cmd_name)[source]
Imports the extension module for a particular command name and returns it.
- Parameters:
cmd_name (str) – name of the command for which to get a module (contains -, not _).
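The dash-to-underscore distinction matters because command names use -, while Python module names must use _. The translation can be sketched as follows (a hypothetical helper for illustration, not Spack's actual code):

```python
def cmd_to_module_name(cmd_name: str) -> str:
    """Translate a command name (with dashes) to a Python module name."""
    return cmd_name.replace("-", "_")
```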
- spack.extensions.get_template_dirs()[source]
Returns the list of directories where to search for templates in extensions.
spack.fetch_strategy module
Fetch strategies are used to download source code into a staging area in order to build it. They need to define the following methods:
- fetch()
This should attempt to download/check out source from somewhere.
- check()
Apply a checksum to the downloaded source code, e.g. for an archive. May not do anything if the fetch method was safe to begin with.
- expand()
Expand (e.g., an archive) downloaded file to source, with the standard stage source path as the destination directory.
- reset()
Restore original state of downloaded code. Used by clean commands. This may just remove the expanded source and re-expand an archive, or it may run something like git reset --hard.
- archive()
Archive a source directory, e.g. for creating a mirror.
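The five-method contract above can be sketched with a minimal, self-contained strategy. This is purely illustrative: the class, its in-memory "download", and its temporary stage are invented for the example; real strategies subclass spack.fetch_strategy.FetchStrategy and operate on a build stage.

```python
import hashlib
import os
import shutil
import tempfile


class ToyFetchStrategy:
    """Illustrative fetcher that 'downloads' bytes into a staging directory."""

    def __init__(self, payload: bytes, digest: str):
        self.payload = payload          # stands in for a remote archive
        self.digest = digest            # expected sha256 hex digest
        self.stage = tempfile.mkdtemp()
        self.archive_file = os.path.join(self.stage, "archive.bin")
        self.source_path = os.path.join(self.stage, "src")

    def fetch(self) -> bool:
        """Download/check out source from somewhere (here: write the payload)."""
        with open(self.archive_file, "wb") as f:
            f.write(self.payload)
        return True

    def check(self) -> None:
        """Apply a checksum to the downloaded archive."""
        with open(self.archive_file, "rb") as f:
            actual = hashlib.sha256(f.read()).hexdigest()
        if actual != self.digest:
            raise ValueError(f"checksum mismatch: {actual} != {self.digest}")

    def expand(self) -> None:
        """Expand the downloaded file into the stage source path."""
        os.makedirs(self.source_path, exist_ok=True)
        shutil.copy(self.archive_file, os.path.join(self.source_path, "code"))

    def reset(self) -> None:
        """Restore the freshly downloaded state by re-expanding."""
        shutil.rmtree(self.source_path, ignore_errors=True)
        self.expand()

    def archive(self, destination: str) -> None:
        """Archive the downloaded data, e.g. for creating a mirror."""
        shutil.copy(self.archive_file, destination)
```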
- class spack.fetch_strategy.BundleFetchStrategy(**kwargs)[source]
Bases:
FetchStrategy
Fetch strategy associated with bundle, or no-code, packages.
Having a basic fetch strategy is a requirement for executing post-install hooks. Consequently, this class provides the API but does little more than log messages.
TODO: Remove this class by refactoring resource handling and the link between composite stages and composite fetch strategies (see #11981).
- property cachable
Report False as there is no code to cache.
- class spack.fetch_strategy.CacheURLFetchStrategy(url=None, checksum=None, **kwargs)[source]
Bases:
URLFetchStrategy
The resource associated with a cache URL may be out of date.
- exception spack.fetch_strategy.ChecksumError(message, long_message=None)[source]
Bases:
FetchError
Raised when archive fails to checksum.
- class spack.fetch_strategy.CvsFetchStrategy(**kwargs)[source]
Bases:
VCSFetchStrategy
- Fetch strategy that gets source code from a CVS repository.
Use like this in a package:
- version('name', cvs=':pserver:anonymous@www.example.com:/cvsroot%module=modulename')
Optionally, you can provide a branch and/or a date for the URL:
- version('name', cvs=':pserver:anonymous@www.example.com:/cvsroot%module=modulename', branch='branchname', date='date')
Repositories are checked out into the standard stage source path directory.
- archive(destination)[source]
Create an archive of the downloaded data for a mirror.
For downloaded files, this should preserve the checksum of the original file. For repositories, it should just create an expandable tarball out of the downloaded repository.
- property cachable
Whether fetcher is capable of caching the resource it retrieves.
This generally is determined by whether the resource is identifiably associated with a specific package version.
- Returns:
True if can cache, False otherwise.
- Return type:
- property cvs
- fetch()[source]
Fetch source code archive or repo.
- Returns:
True on success, False on failure.
- Return type:
- mirror_id()[source]
This is a unique ID for a source that is intended to help identify reuse of resources across packages.
It is unique like source-id, but it does not include the package name and is not necessarily easy for a human to create themselves.
- reset()[source]
Revert to freshly downloaded state.
For archive files, this may just re-expand the archive.
- exception spack.fetch_strategy.ExtrapolationError(message, long_message=None)[source]
Bases:
FetchError
Raised when we can’t extrapolate a version for a package.
- exception spack.fetch_strategy.FailedDownloadError(url, msg='')[source]
Bases:
FetchError
Raised when a download fails.
- class spack.fetch_strategy.FetchStrategy(**kwargs)[source]
Bases:
object
Superclass of all fetch strategies.
- archive(destination)[source]
Create an archive of the downloaded data for a mirror.
For downloaded files, this should preserve the checksum of the original file. For repositories, it should just create an expandable tarball out of the downloaded repository.
- property cachable
Whether fetcher is capable of caching the resource it retrieves.
This generally is determined by whether the resource is identifiably associated with a specific package version.
- Returns:
True if can cache, False otherwise.
- Return type:
- fetch()[source]
Fetch source code archive or repo.
- Returns:
True on success, False on failure.
- Return type:
- classmethod matches(args)[source]
Predicate that matches fetch strategies to arguments of the version directive.
- Parameters:
args – arguments of the version directive
- mirror_id()[source]
This is a unique ID for a source that is intended to help identify reuse of resources across packages.
It is unique like source-id, but it does not include the package name and is not necessarily easy for a human to create themselves.
- reset()[source]
Revert to freshly downloaded state.
For archive files, this may just re-expand the archive.
- class spack.fetch_strategy.FetchStrategyComposite[source]
Bases:
Composite
Composite for a FetchStrategy object.
- classmethod matches(args)
Predicate that matches fetch strategies to arguments of the version directive.
- Parameters:
args – arguments of the version directive
- exception spack.fetch_strategy.FetcherConflict(message, long_message=None)[source]
Bases:
FetchError
Raised for packages with invalid fetch attributes.
- class spack.fetch_strategy.GCSFetchStrategy(*args, **kwargs)[source]
Bases:
URLFetchStrategy
FetchStrategy that pulls from a GCS bucket.
- class spack.fetch_strategy.GitFetchStrategy(**kwargs)[source]
Bases:
VCSFetchStrategy
Fetch strategy that gets source code from a git repository. Use like this in a package:
version('name', git='https://github.com/project/repo.git')
Optionally, you can provide a branch, tag, or commit to check out, e.g.:
version('1.1', git='https://github.com/project/repo.git', tag='v1.1')
You can use these three optional attributes in addition to git:
branch: particular branch to build from (default is the repository's default branch)
tag: particular tag to check out
commit: particular commit hash in the repo
Repositories are cloned into the standard stage source path directory.
- archive(destination)[source]
Create an archive of the downloaded data for a mirror.
For downloaded files, this should preserve the checksum of the original file. For repositories, it should just create an expandable tarball out of the downloaded repository.
- property cachable
Whether fetcher is capable of caching the resource it retrieves.
This generally is determined by whether the resource is identifiably associated with a specific package version.
- Returns:
True if can cache, False otherwise.
- Return type:
- clone(dest=None, commit=None, branch=None, tag=None, bare=False)[source]
Clone a repository to a path.
This method handles cloning from git, but does not require a stage.
- Parameters:
dest (str or None) – The path into which the code is cloned. If None, requires a stage and uses the stage’s source path.
commit (str or None) – A commit to fetch from the remote. Only one of commit, branch, and tag may be non-None.
branch (str or None) – A branch to fetch from the remote.
tag (str or None) – A tag to fetch from the remote.
bare (bool) – Execute a “bare” git clone (--bare option to git)
- fetch()[source]
Fetch source code archive or repo.
- Returns:
True on success, False on failure.
- Return type:
- property git
- property git_version
- git_version_re = 'git version (\\S+)'
- mirror_id()[source]
This is a unique ID for a source that is intended to help identify reuse of resources across packages.
It is unique like source-id, but it does not include the package name and is not necessarily easy for a human to create themselves.
- optional_attrs: List[str] = ['tag', 'branch', 'commit', 'submodules', 'get_full_repo', 'submodules_delete']
- protocol_supports_shallow_clone()[source]
Shallow clone operations (--depth #) are not supported by the basic HTTP protocol or by no-protocol file specifications. Use (e.g.) https:// or file:// instead.
- reset()[source]
Revert to freshly downloaded state.
For archive files, this may just re-expand the archive.
- source_id()[source]
A unique ID for the source.
It is intended that a human could easily generate this themselves using the information available to them in the Spack package.
The returned value is added to the content which determines the full hash for a package using str().
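clone() above allows only one of commit, branch, and tag to be non-None. That mutual-exclusion rule can be sketched with a small validator (a hypothetical helper written for this example, not Spack's actual implementation):

```python
def validate_clone_args(commit=None, branch=None, tag=None):
    """Enforce that at most one Git reference type is requested."""
    refs = {"commit": commit, "branch": branch, "tag": tag}
    given = [name for name, value in refs.items() if value is not None]
    if len(given) > 1:
        raise ValueError(f"only one of commit, branch, and tag may be set; got {given}")
    # Fall back to the repository's default branch when nothing is given.
    return given[0] if given else "default branch"
```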
- class spack.fetch_strategy.GoFetchStrategy(**kwargs)[source]
Bases:
VCSFetchStrategy
Fetch strategy that employs the go get infrastructure.
Use like this in a package:
- version('name', go='github.com/monochromegane/the_platinum_searcher/...')
Go get does not natively support versions; they can be faked with git.
The fetched source will be moved to the standard stage source path directory during the expand step.
- archive(destination)[source]
Create an archive of the downloaded data for a mirror.
For downloaded files, this should preserve the checksum of the original file. For repositories, it should just create an expandable tarball out of the downloaded repository.
- fetch()[source]
Fetch source code archive or repo.
- Returns:
True on success, False on failure.
- Return type:
- property go
- property go_version
- class spack.fetch_strategy.HgFetchStrategy(**kwargs)[source]
Bases:
VCSFetchStrategy
Fetch strategy that gets source code from a Mercurial repository. Use like this in a package:
version('name', hg='https://jay.grs.rwth-aachen.de/hg/lwm2')
Optionally, you can provide a branch or revision to check out, e.g.:
- version('torus', hg='https://jay.grs.rwth-aachen.de/hg/lwm2', branch='torus')
You can use the optional 'revision' attribute to check out a branch, tag, or particular revision in hg. To prevent non-reproducible builds, using a moving target like a branch is discouraged.
revision: Particular revision, branch, or tag.
Repositories are cloned into the standard stage source path directory.
- archive(destination)[source]
Create an archive of the downloaded data for a mirror.
For downloaded files, this should preserve the checksum of the original file. For repositories, it should just create an expandable tarball out of the downloaded repository.
- property cachable
Whether fetcher is capable of caching the resource it retrieves.
This generally is determined by whether the resource is identifiably associated with a specific package version.
- Returns:
True if can cache, False otherwise.
- Return type:
- fetch()[source]
Fetch source code archive or repo.
- Returns:
True on success, False on failure.
- Return type:
- property hg
Returns: Executable: the hg executable
- mirror_id()[source]
This is a unique ID for a source that is intended to help identify reuse of resources across packages.
It is unique like source-id, but it does not include the package name and is not necessarily easy for a human to create themselves.
- reset()[source]
Revert to freshly downloaded state.
For archive files, this may just re-expand the archive.
- exception spack.fetch_strategy.InvalidArgsError(pkg=None, version=None, **args)[source]
Bases:
FetchError
Raised when a version can’t be deduced from a set of arguments.
- exception spack.fetch_strategy.NoArchiveFileError(message, long_message=None)[source]
Bases:
FetchError
Raised when an archive file is expected but none exists.
- exception spack.fetch_strategy.NoCacheError(message, long_message=None)[source]
Bases:
FetchError
Raised when there is no cached archive for a package.
- exception spack.fetch_strategy.NoDigestError(message, long_message=None)[source]
Bases:
FetchError
Raised after attempt to checksum when URL has no digest.
- exception spack.fetch_strategy.NoStageError(method)[source]
Bases:
FetchError
Raised when fetch operations are called before set_stage().
- class spack.fetch_strategy.S3FetchStrategy(*args, **kwargs)[source]
Bases:
URLFetchStrategy
FetchStrategy that pulls from an S3 bucket.
- class spack.fetch_strategy.SvnFetchStrategy(**kwargs)[source]
Bases:
VCSFetchStrategy
- Fetch strategy that gets source code from a subversion repository.
Use like this in a package:
version('name', svn='http://www.example.com/svn/trunk')
Optionally, you can provide a revision for the URL:
- version('name', svn='http://www.example.com/svn/trunk', revision='1641')
Repositories are checked out into the standard stage source path directory.
- archive(destination)[source]
Create an archive of the downloaded data for a mirror.
For downloaded files, this should preserve the checksum of the original file. For repositories, it should just create an expandable tarball out of the downloaded repository.
- property cachable
Whether fetcher is capable of caching the resource it retrieves.
This generally is determined by whether the resource is identifiably associated with a specific package version.
- Returns:
True if can cache, False otherwise.
- Return type:
- fetch()[source]
Fetch source code archive or repo.
- Returns:
True on success, False on failure.
- Return type:
- mirror_id()[source]
This is a unique ID for a source that is intended to help identify reuse of resources across packages.
It is unique like source-id, but it does not include the package name and is not necessarily easy for a human to create themselves.
- reset()[source]
Revert to freshly downloaded state.
For archive files, this may just re-expand the archive.
- source_id()[source]
A unique ID for the source.
It is intended that a human could easily generate this themselves using the information available to them in the Spack package.
The returned value is added to the content which determines the full hash for a package using str().
- property svn
- class spack.fetch_strategy.URLFetchStrategy(url=None, checksum=None, **kwargs)[source]
Bases:
FetchStrategy
URLFetchStrategy pulls source code from a URL for an archive, checks the archive against a checksum, and decompresses the archive.
The destination for the resulting file(s) is the standard stage path.
- property archive_file
Path to the source archive within this stage directory.
- property cachable
Whether fetcher is capable of caching the resource it retrieves.
This generally is determined by whether the resource is identifiably associated with a specific package version.
- Returns:
True if can cache, False otherwise.
- Return type:
- property candidate_urls
- check()[source]
Check the downloaded archive against a checksum digest. No-op if this stage checks code out of a repository.
- property curl
- fetch()[source]
Fetch source code archive or repo.
- Returns:
True on success, False on failure.
- Return type:
- mirror_id()[source]
This is a unique ID for a source that is intended to help identify reuse of resources across packages.
It is unique like source-id, but it does not include the package name and is not necessarily easy for a human to create themselves.
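URLFetchStrategy's fetch-then-verify flow (try candidate URLs in order, check the download against a known digest) can be sketched as follows. This is a simplification written for this example: fetch_one is a caller-supplied stand-in for the real curl-based download, and the function names are invented.

```python
import hashlib


def fetch_with_checksum(candidate_urls, expected_sha256, fetch_one):
    """Try each URL until a download passes checksum verification.

    fetch_one(url) -> bytes is a caller-supplied download function
    (a stand-in for the real download machinery).
    """
    for url in candidate_urls:
        try:
            data = fetch_one(url)
        except OSError:
            continue  # this candidate/mirror failed; try the next one
        if hashlib.sha256(data).hexdigest() == expected_sha256:
            return data
        raise ValueError(f"checksum mismatch for {url}")
    raise RuntimeError("all candidate URLs failed")
```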
- class spack.fetch_strategy.VCSFetchStrategy(**kwargs)[source]
Bases:
FetchStrategy
Superclass for version control system fetch strategies.
Like all fetchers, VCS fetchers are identified by the attributes passed to the
version
directive. The optional_attrs for a VCS fetch strategy represent types of revisions, e.g. tags, branches, commits, etc. The required attributes (git, svn, etc.) are used to specify the URL and to distinguish a VCS fetch strategy from a URL fetch strategy.
- spack.fetch_strategy.all_strategies = [<class 'spack.fetch_strategy.BundleFetchStrategy'>, <class 'spack.fetch_strategy.URLFetchStrategy'>, <class 'spack.fetch_strategy.CacheURLFetchStrategy'>, <class 'spack.fetch_strategy.GoFetchStrategy'>, <class 'spack.fetch_strategy.GitFetchStrategy'>, <class 'spack.fetch_strategy.CvsFetchStrategy'>, <class 'spack.fetch_strategy.SvnFetchStrategy'>, <class 'spack.fetch_strategy.HgFetchStrategy'>, <class 'spack.fetch_strategy.S3FetchStrategy'>, <class 'spack.fetch_strategy.GCSFetchStrategy'>]
List of all fetch strategies, created by FetchStrategy metaclass.
- spack.fetch_strategy.check_pkg_attributes(pkg)[source]
Find ambiguous top-level fetch attributes in a package.
Currently this only ensures that two or more VCS fetch strategies are not specified at once.
- spack.fetch_strategy.for_package_version(pkg, version=None)[source]
Determine a fetch strategy based on the arguments supplied to version() in the package description.
- spack.fetch_strategy.from_kwargs(**kwargs)[source]
Construct an appropriate FetchStrategy from the given keyword arguments.
- Parameters:
**kwargs – dictionary of keyword arguments, e.g. from a version() directive in a package.
- Returns:
The fetch strategy that matches the args, based on attribute names (e.g., git, hg, etc.)
- Return type:
- Raises:
spack.util.web.FetchError – If no
fetch_strategy
matches the args.
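The attribute-name dispatch that from_kwargs() performs can be sketched with a registry keyed on each strategy's url_attr. The classes below are invented for illustration; Spack's real registry is all_strategies, populated by the FetchStrategy metaclass.

```python
class GitStrategy:
    url_attr = "git"


class HgStrategy:
    url_attr = "hg"


class UrlStrategy:
    url_attr = "url"


ALL_STRATEGIES = [GitStrategy, HgStrategy, UrlStrategy]


def from_kwargs_sketch(**kwargs):
    """Pick the strategy whose url_attr appears among the keyword arguments."""
    for cls in ALL_STRATEGIES:
        if cls.url_attr in kwargs:
            return cls
    raise ValueError(f"no fetch strategy matches {sorted(kwargs)}")
```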
- spack.fetch_strategy.from_list_url(pkg)[source]
If a package provides a URL which lists URLs for resources by version, this can create a fetcher for a URL discovered for the specified package’s version.
- spack.fetch_strategy.from_url(url)[source]
Given a URL, find an appropriate fetch strategy for it. Currently just gives you a URLFetchStrategy that uses curl.
- TODO: make this return appropriate fetch strategies for other types of URLs.
- spack.fetch_strategy.from_url_scheme(url, *args, **kwargs)[source]
Finds a suitable FetchStrategy by matching its url_attr with the scheme in the given url.
spack.filesystem_view module
- class spack.filesystem_view.FilesystemView(root, layout, **kwargs)[source]
Bases:
object
Governs a filesystem view that is located at certain root-directory.
Packages are linked from their install directories into a common file hierarchy.
In distributed filesystems, loading each installed package separately can lead to slow-downs due to too many directories being traversed. This can be circumvented by loading all needed modules into a common directory structure.
- add_specs(*specs, **kwargs)[source]
Add given specs to view.
Should accept with_dependencies as keyword argument (default True) to indicate whether or not dependencies should be activated as well.
Should accept an exclude keyword argument containing a list of regexps that filter out matching spec names.
This method should make use of activate_standalone.
- get_spec(spec)[source]
Return the actual spec linked in this view (i.e. do not look it up in the database by name).
spec can be a name or a spec from which the name is extracted.
As there can only be a single version active for any spec the name is enough to identify the spec in the view.
If no spec is present, returns None.
- print_status(*specs, **kwargs)[source]
- Print a short summary about the given specs, detailing whether:
they are active in the view.
they are active but the activated version differs.
they are not active in the view.
Takes with_dependencies keyword argument so that the status of dependencies is printed as well.
- remove_specs(*specs, **kwargs)[source]
Removes given specs from view.
Should accept with_dependencies as keyword argument (default True) to indicate whether or not dependencies should be deactivated as well.
Should accept with_dependents as keyword argument (default True) to indicate whether or not dependents on the deactivated specs should be removed as well.
Should accept an exclude keyword argument containing a list of regexps that filter out matching spec names.
This method should make use of deactivate_standalone.
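Because only a single version of any spec can be active in a view, get_spec() can identify a spec by name alone. That lookup can be modeled with a toy name-keyed mapping (invented for illustration; the real view links files on disk):

```python
class ToyView:
    """Name-keyed model of a view: at most one active spec per package name."""

    def __init__(self):
        self._by_name = {}

    def add_spec(self, name, version):
        # Adding a spec with an existing name replaces the active version.
        self._by_name[name] = (name, version)

    def remove_spec(self, name):
        self._by_name.pop(name, None)

    def get_spec(self, name):
        # Returns None when no spec with that name is linked in the view.
        return self._by_name.get(name)
```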
- class spack.filesystem_view.YamlFilesystemView(root, layout, **kwargs)[source]
Bases:
FilesystemView
Filesystem view to work with a yaml based directory layout.
- add_specs(*specs, **kwargs)[source]
Add given specs to view.
Should accept with_dependencies as keyword argument (default True) to indicate whether or not dependencies should be activated as well.
Should accept an exclude keyword argument containing a list of regexps that filter out matching spec names.
This method should make use of activate_standalone.
- get_conflicts(*specs)[source]
Return list of tuples (<spec>, <spec in view>) where the spec active in the view differs from the one to be activated.
- get_projection_for_spec(spec)[source]
Return the projection for a spec in this view.
Relies on the ordering of projections to avoid ambiguity.
- get_spec(spec)[source]
Return the actual spec linked in this view (i.e. do not look it up in the database by name).
spec can be a name or a spec from which the name is extracted.
As there can only be a single version active for any spec the name is enough to identify the spec in the view.
If no spec is present, returns None.
- print_conflict(spec_active, spec_specified, level='error')[source]
Singular print function for spec conflicts.
- print_status(*specs, **kwargs)[source]
- Print a short summary about the given specs, detailing whether:
they are active in the view.
they are active but the activated version differs.
they are not active in the view.
Takes with_dependencies keyword argument so that the status of dependencies is printed as well.
- remove_specs(*specs, **kwargs)[source]
Removes given specs from view.
Should accept with_dependencies as keyword argument (default True) to indicate whether or not dependencies should be deactivated as well.
Should accept with_dependents as keyword argument (default True) to indicate whether or not dependents on the deactivated specs should be removed as well.
Should accept an exclude keyword argument containing a list of regexps that filter out matching spec names.
This method should make use of deactivate_standalone.
spack.gcs_handler module
- class spack.gcs_handler.GCSHandler[source]
Bases:
BaseHandler
spack.graph module
Functions for graphing DAGs of dependencies.
This file contains code for graphing DAGs of software packages (i.e. Spack specs). There are two main functions you probably care about:
graph_ascii() will output a colored graph of a spec in ascii format, kind of like the graph git shows with “git log --graph”, e.g.:
o mpileaks
|\
| |\
| o | callpath
|/| |
| |\|
| |\ \
| | |\ \
| | | | o adept-utils
| |_|_|/|
|/| | | |
o | | | | mpi
/ / / /
| | o | dyninst
| |/| |
|/|/| |
| | |/
| o | libdwarf
|/ /
o | libelf
/
o boost
graph_dot() will output a graph of a spec (or multiple specs) in dot format.
- class spack.graph.AsciiGraph[source]
Bases:
object
- write(spec, color=None, out=None)[source]
Write out an ascii graph of the provided spec.
Arguments: spec – spec to graph. This only handles one spec at a time.
Optional arguments:
out – file object to write out to (default is sys.stdout)
- color – whether to write in color. Default is to autodetect
based on output file.
- class spack.graph.DAGWithDependencyTypes[source]
Bases:
DotGraphBuilder
DOT graph with link,run nodes grouped together and edges colored according to the dependency types.
- class spack.graph.DotGraphBuilder[source]
Bases:
object
Visit edges of a graph and build DOT options for nodes and edges
- class spack.graph.SimpleDAG[source]
Bases:
DotGraphBuilder
Simple DOT graph, with nodes colored uniformly and edges without properties
- class spack.graph.StaticDag[source]
Bases:
DotGraphBuilder
DOT graph for possible dependencies
- spack.graph.find(seq, predicate)[source]
Find index in seq for which predicate is True.
Searches the sequence and returns the index of the element for which the predicate evaluates to True. Returns -1 if the predicate does not evaluate to True for any element in seq.
- spack.graph.graph_ascii(spec, node='o', out=None, debug=False, indent=0, color=None, deptype='all')[source]
- spack.graph.graph_dot(specs: List[Spec], builder: DotGraphBuilder | None = None, deptype: str | List[str] | Tuple[str, ...] = 'all', out: TextIO | None = None)[source]
DOT graph of the concrete specs passed as input.
- Parameters:
specs – specs to be represented
builder – builder to use to render the graph
deptype – dependency types to consider
out – optional output stream. If None sys.stdout is used
- spack.graph.static_graph_dot(specs: List[Spec], deptype: str | Tuple[str, ...] | None = 'all', out: TextIO | None = None)[source]
Static DOT graph with edges to all possible dependencies.
- Parameters:
specs – abstract specs to be represented
deptype – dependency types to consider
out – optional output stream. If None sys.stdout is used
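The DOT output that graph_dot() and static_graph_dot() produce can be approximated from an adjacency mapping. This is an illustrative sketch only; the real DotGraphBuilder subclasses also style nodes and edges (colors, grouping, dependency types).

```python
def to_dot(edges):
    """Render a dependency mapping {parent: [children]} as a DOT digraph."""
    lines = ["digraph G {"]
    for parent, children in sorted(edges.items()):
        for child in children:
            lines.append(f'  "{parent}" -> "{child}";')
    lines.append("}")
    return "\n".join(lines)
```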
spack.hash_types module
Definitions that control how Spack creates Spec hashes.
- class spack.hash_types.SpecHashDescriptor(deptype, package_hash, name, override=None)[source]
Bases:
object
This class defines how hashes are generated on Spec objects.
Spec hashes in Spack are generated from a serialized (e.g., with YAML) representation of the Spec graph. The representation may only include certain dependency types, and it may optionally include a canonicalized hash of the package.py for each node in the graph.
We currently use different hashes for different use cases.
- property attr
Private attribute stored on spec
- spack.hash_types.dag_hash = <spack.hash_types.SpecHashDescriptor object>
Spack’s deployment hash. Includes all inputs that can affect how a package is built.
- spack.hash_types.package_hash = <spack.hash_types.SpecHashDescriptor object>
Package hash used as part of dag hash
- spack.hash_types.process_hash = <spack.hash_types.SpecHashDescriptor object>
Hash descriptor used only to transfer a DAG, as is, across processes
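The idea of deriving a hash from a canonical serialization of the graph, as described for SpecHashDescriptor above, can be sketched with json and sha256. This is a deliberate simplification: Spack's actual hashing canonicalizes specs differently and filters by dependency type.

```python
import hashlib
import json


def node_hash(node: dict) -> str:
    """Hash a node's canonical JSON form; key order must not affect the result."""
    canonical = json.dumps(node, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode()).hexdigest()
```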
spack.install_test module
- class spack.install_test.PackageTest(pkg: Pb)[source]
Bases:
object
The class that manages stand-alone (post-install) package tests.
- phase_tests(builder: Builder, phase_name: str, method_names: List[str])[source]
Execute the builder’s package phase-time tests.
- Parameters:
builder – builder for package being tested
phase_name – the name of the build-time phase (e.g., build, install)
method_names – phase-specific callback method names
- stand_alone_tests(kwargs)[source]
Run the package’s stand-alone tests.
- Parameters:
kwargs (dict) – arguments to be used by the test process
- status(name: str, status: TestStatus, msg: str | None = None)[source]
Track and print the test status for the test part name.
- exception spack.install_test.SkipTest[source]
Bases:
Exception
Raised when a test (part) is being skipped.
- exception spack.install_test.TestFailure(failures: List[Tuple[BaseException, str]])[source]
Bases:
SpackError
Raised when package tests have failed for an installation.
- spack.install_test.TestFailureType
Stand-alone test failure info type
alias of
Tuple
[BaseException
,str
]
- class spack.install_test.TestStatus(value, names=None, *, module=None, qualname=None, type=None, start=1, boundary=None)[source]
Bases:
Enum
Names of different stand-alone test states.
- FAILED = 1
- NO_TESTS = -1
- PASSED = 2
- SKIPPED = 0
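The states above map to integer values, which can be mirrored with a standard Enum (values copied from the listing above):

```python
from enum import Enum


class TestStatus(Enum):
    """Names of different stand-alone test states."""
    NO_TESTS = -1
    SKIPPED = 0
    FAILED = 1
    PASSED = 2
```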
- class spack.install_test.TestSuite(specs, alias=None)[source]
Bases:
object
The class that manages specs for
spack test run
execution.
- property content_hash
The hash used to uniquely identify the test suite.
- property current_test_cache_dir
Path to the test stage directory where the current spec’s cached build-time files were automatically copied.
- Returns:
path to the current spec’s staged, cached build-time files.
- Return type:
- Raises:
TestSuiteSpecError – If there is no spec being tested
- property current_test_data_dir
Path to the test stage directory where the current spec’s custom package (data) files were automatically copied.
- Returns:
path to the current spec’s staged, custom package (data) files
- Return type:
- Raises:
TestSuiteSpecError – If there is no spec being tested
- static from_dict(d)[source]
Instantiates a TestSuite based on a dictionary of specs and an optional alias:
specs: list of the test suite’s specs in dictionary form
alias: the test suite alias
- Returns:
Instance created from the specs
- Return type:
- static from_file(filename)[source]
Instantiate a TestSuite using the specs and optional alias provided in the given file.
- Parameters:
filename (str) – The path to the JSON file containing the test suite specs and optional alias.
- Raises:
BaseException – sjson.SpackJSONError if problem parsing the file
- log_file_for_spec(spec)[source]
The test log file path for the provided spec.
- Parameters:
spec (spack.spec.Spec) – instance of the spec under test
- Returns:
the path to the spec’s log file
- Return type:
- property name
The name (alias or, if none, hash) of the test suite.
- property results_file
The path to the results summary file.
- property stage
The root test suite stage directory.
- Returns:
the spec’s test stage directory path
- Return type:
- test_dir_for_spec(spec)[source]
The path to the test stage directory for the provided spec.
- Parameters:
spec (spack.spec.Spec) – instance of the spec under test
- Returns:
the spec’s test stage directory path
- Return type:
- classmethod test_log_name(spec)[source]
The standard log filename for a spec.
- Parameters:
spec (spack.spec.Spec) – instance of the spec under test
- Returns:
the spec’s log filename
- Return type:
- classmethod test_pkg_id(spec)[source]
The standard install test package identifier.
- Parameters:
spec – instance of the spec under test
- Returns:
the install test package identifier
- Return type:
- test_status(spec: Spec, externals: bool) TestStatus | None [source]
Determine the overall test results status for the spec.
- Parameters:
spec – instance of the spec under test
externals –
True if externals are to be tested, else False
- Returns:
the spec’s test status if available or
None
- tested_file_for_spec(spec)[source]
The test status file path for the spec.
- Parameters:
spec (spack.spec.Spec) – instance of the spec under test
- Returns:
the spec’s test status file path
- Return type:
- classmethod tested_file_name(spec)[source]
The standard test status filename for the spec.
- Parameters:
spec (spack.spec.Spec) – instance of the spec under test
- Returns:
the spec’s test status filename
- Return type:
- to_dict()[source]
Build a dictionary for the test suite.
- Returns:
The dictionary contains entries for up to two keys:
specs: list of the test suite’s specs in dictionary form
alias: the alias, or name, given to the test suite if provided
- Return type:
- write_test_result(spec, result)[source]
Write the spec’s test result to the test suite results file.
- Parameters:
spec (spack.spec.Spec) – instance of the spec under test
result (str) – result from the spec’s test execution (e.g., PASSED)
- exception spack.install_test.TestSuiteError(message, long_message=None)[source]
Bases:
SpackError
Raised when there is an error with the test suite.
- exception spack.install_test.TestSuiteFailure(num_failures)[source]
Bases:
SpackError
Raised when one or more tests in a suite have failed.
- exception spack.install_test.TestSuiteNameError(message, long_message=None)[source]
Bases:
SpackError
Raised when there is an issue with the naming of the test suite.
- exception spack.install_test.TestSuiteSpecError(message, long_message=None)[source]
Bases:
SpackError
Raised when there is an issue associated with the spec being tested.
- spack.install_test.cache_extra_test_sources(pkg: Pb, srcs: str | List[str])[source]
Copy relative source paths to the corresponding install test subdir
This routine is intended as an optional install test setup helper for grabbing source files/directories during the installation process and copying them to the installation test subdirectory for subsequent use during install testing.
- Parameters:
pkg – package being tested
srcs – relative path for file(s) and/or subdirectory(ies) located in the staged source path that are to be copied to the corresponding location(s) under the install testing directory.
- Raises:
spack.installer.InstallError – if any of the source paths are absolute or do not exist under the build stage
- spack.install_test.check_outputs(expected: list | set | str, actual: str)[source]
Ensure the expected outputs are contained in the actual outputs.
- Parameters:
expected – expected raw output string(s)
actual – actual output string
- Raises:
RuntimeError – the expected output is not found in the actual output
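As a rough illustration, the containment check this helper performs might look like the following sketch (the function name and the literal-substring matching policy are assumptions; the real helper may match patterns rather than substrings):

```python
from typing import List, Set, Union


def check_outputs_sketch(expected: Union[List[str], Set[str], str], actual: str) -> None:
    """Raise RuntimeError if any expected string is missing from actual."""
    # A lone string is treated as a one-element collection.
    if isinstance(expected, str):
        expected = [expected]
    missing = [exp for exp in expected if exp not in actual]
    if missing:
        raise RuntimeError("Expected output not found: " + ", ".join(missing))
```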
- spack.install_test.copy_test_files(pkg: Pb, test_spec: Spec)[source]
Copy the spec’s cached and custom test files to the test stage directory.
- Parameters:
pkg – package being tested
test_spec – spec being tested, where the spec may be virtual
- Raises:
TestSuiteError – package must be part of an active test suite
- spack.install_test.find_required_file(root: str, filename: str, expected: int = 1, recursive: bool = True) str | List[str] [source]
Find the required file(s) under the root directory.
- Parameters:
root – root directory for the search
filename – name of the file being located
expected – expected number of files to be found under the directory (default is 1)
recursive – True if subdirectories are to be recursively searched, else False (default is True)
Returns: the path(s), relative to root, to the required file(s)
- Raises:
Exception – SkipTest when number of files detected does not match expected
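A hedged sketch of the search such a helper performs (the name is hypothetical, and the real helper raises a SkipTest exception where this sketch raises RuntimeError; it assumes every path below root is walkable):

```python
import os
from typing import List, Union


def find_required_file_sketch(
    root: str, filename: str, expected: int = 1, recursive: bool = True
) -> Union[str, List[str]]:
    """Locate filename under root; complain if the count differs from expected."""
    found = []
    if recursive:
        for dirpath, _, files in os.walk(root):
            if filename in files:
                # Report paths relative to the search root.
                found.append(os.path.relpath(os.path.join(dirpath, filename), root))
    elif os.path.isfile(os.path.join(root, filename)):
        found.append(filename)
    if len(found) != expected:
        # The real helper raises a SkipTest exception at this point.
        raise RuntimeError(f"expected {expected} of {filename}, found {len(found)}")
    return found[0] if expected == 1 else found
```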
- spack.install_test.get_all_test_suites()[source]
Retrieves all validly staged TestSuites
- Returns:
a list of TestSuite objects, which may be empty if there are none
- Return type:
- spack.install_test.get_escaped_text_output(filename: str) List[str] [source]
Retrieve and escape the expected text output from the file
- Parameters:
filename – path to the file
- Returns:
escaped text lines read from the file
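One plausible implementation (an assumption, not Spack’s actual code) escapes each line with re.escape so the literal text can later be used safely as a regular-expression pattern:

```python
import re
from typing import List


def get_escaped_text_output_sketch(filename: str) -> List[str]:
    """Read expected-output lines and escape any regex metacharacters."""
    with open(filename) as f:
        # re.escape turns a literal line into a safe regex pattern.
        return [re.escape(line.rstrip("\n")) for line in f]
```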
- spack.install_test.get_named_test_suites(name)[source]
Retrieves test suites with the provided name.
- spack.install_test.get_test_stage_dir()[source]
Retrieves the config:test_stage path to the configured test stage root directory.
- Returns:
absolute path to the configured test stage root or, if none, the default test stage path
- Return type:
- spack.install_test.get_test_suite(name: str) TestSuite | None [source]
Ensure there is only one matching test suite with the provided name.
- Returns:
the matching test suite if there is exactly one, else None
- Raises:
TestSuiteNameError – if there is more than one matching TestSuite
- spack.install_test.install_test_root(pkg: Pb)[source]
The install test root directory.
- Parameters:
pkg – package being tested
- spack.install_test.overall_status(current_status: TestStatus, substatuses: List[TestStatus]) TestStatus [source]
Determine the overall status based on the current and associated sub status values.
- Parameters:
current_status – current overall status, assumed to default to PASSED
substatuses – status of each test part or overall status of each test spec
- Returns:
test status encompassing the main test and all subtests
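One plausible escalation policy, sketched with a stand-in status enum (not Spack’s actual TestStatus), promotes the overall status to the worst value seen among the parts:

```python
from enum import IntEnum


class Status(IntEnum):
    # Hypothetical stand-in for spack's TestStatus ordering.
    PASSED = 0
    SKIPPED = 1
    FAILED = 2


def overall_status_sketch(current, substatuses):
    """Escalate the overall status to the worst sub-status observed."""
    # Any failing part makes the whole test fail; otherwise keep current.
    return max([current] + list(substatuses))
```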
- spack.install_test.print_message(logger: nixlog | winlog, msg: str, verbose: bool = False)[source]
Print the message to the log, optionally echoing.
- Parameters:
logger – instance of the output logger (e.g. nixlog or winlog)
msg – message being output
verbose – True displays verbose output, False suppresses it (False is default)
- spack.install_test.process_test_parts(pkg: Pb, test_specs: List[Spec], verbose: bool = False)[source]
Process test parts associated with the package.
- Parameters:
pkg – package being tested
test_specs – list of test specs
verbose – Display verbose output (suppress by default)
- Raises:
TestSuiteError – package must be part of an active test suite
- spack.install_test.results_filename = 'results.txt'
Name of the test suite results (summary) file
- spack.install_test.spack_install_test_log = 'install-time-test-log.txt'
Name of the Spack install phase-time test log file
- spack.install_test.test_function_names(pkg: Pb | Type[Pb], add_virtuals: bool = False) List[str] [source]
Grab the names of all non-empty test functions.
- Parameters:
pkg – package or package class of interest
add_virtuals – True adds test methods of the package’s provided virtuals, False only returns test functions of the package
- Returns:
names of non-empty test functions
- Raises:
ValueError – occurs if pkg is not a package class
- spack.install_test.test_functions(pkg: Pb | Type[Pb], add_virtuals: bool = False) List[Tuple[str, Callable]] [source]
Grab all non-empty test functions.
- Parameters:
pkg – package or package class of interest
add_virtuals – True adds test methods of the package’s provided virtuals, False only returns test functions of the package
- Returns:
list of non-empty test functions’ (name, function)
- Raises:
ValueError – occurs if pkg is not a package class
- spack.install_test.test_part(pkg: Pb, test_name: str, purpose: str, work_dir: str = '.', verbose: bool = False)[source]
- spack.install_test.test_suite_filename = 'test_suite.lock'
Name of the test suite’s (JSON) lock file
- spack.install_test.virtuals(pkg)[source]
Return a list of unique virtuals for the package.
- Parameters:
pkg – package of interest
Returns: names of unique virtual packages
spack.installer module
This module encapsulates package installation functionality.
The PackageInstaller coordinates concurrent builds of packages for the same Spack instance by leveraging the dependency DAG and file system locks. It also proceeds with the installation of non-dependent packages of failed dependencies in order to install as many dependencies of a package as possible.
Bottom-up traversal of the dependency DAG while prioritizing packages with no uninstalled dependencies allows multiple processes to perform concurrent builds of separate packages associated with a spec.
File system locks enable coordination such that no two processes attempt to build the same or a failed dependency package.
Failures to install dependency packages result in removal of their dependents’ build tasks from the current process. A failure file is also written (and locked) so that other processes can detect the failure and adjust their build tasks accordingly.
This module supports the coordination of local and distributed concurrent installations of packages in a Spack instance.
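The bottom-up traversal described above can be modeled with a priority queue keyed on the number of uninstalled dependencies, as in this simplified single-process sketch (it ignores locking, failures, and Spack’s actual BuildTask machinery; every dependency is assumed to appear as a key in deps):

```python
import heapq
import itertools


def install_order(deps):
    """deps maps package name -> set of dependency names (a DAG).

    Yields packages so that every dependency precedes its dependents,
    always preferring packages with no uninstalled dependencies.
    """
    counter = itertools.count()
    # Priority key: (# uninstalled dependencies, insertion sequence).
    heap = [(len(ds), next(counter), pkg) for pkg, ds in deps.items()]
    heapq.heapify(heap)
    installed = set()
    while heap:
        _, _, pkg = heapq.heappop(heap)
        if pkg in installed:
            continue
        uninstalled = deps[pkg] - installed
        if uninstalled:
            # Not ready yet: requeue with an updated key and a fresh
            # sequence number so ready packages are tried first.
            heapq.heappush(heap, (len(uninstalled), next(counter), pkg))
            continue
        installed.add(pkg)
        yield pkg
```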
- exception spack.installer.BadInstallPhase(pkg_name, phase)[source]
Bases:
InstallError
Raised when an install phase option is not allowed for a package.
- class spack.installer.BuildProcessInstaller(pkg, install_args)[source]
Bases:
object
This class implements the part of the installation that happens in the child process.
- class spack.installer.BuildRequest(pkg, install_args)[source]
Bases:
object
Class for representing an installation request.
- get_deptypes(pkg)[source]
Determine the required dependency types for the associated package.
- Parameters:
pkg (spack.package_base.PackageBase) – explicit or implicit package being installed
- Returns:
required dependency type(s) for the package
- Return type:
- has_dependency(dep_id)[source]
Returns True if the package id represents a known dependency of the requested package, False otherwise.
- run_tests(pkg)[source]
Determine if the tests should be run for the provided package.
- Parameters:
pkg (spack.package_base.PackageBase) – explicit or implicit package being installed
- Returns:
True if they should be run; False otherwise
- Return type:
- property spec
The specification associated with the package.
- class spack.installer.BuildTask(pkg, request, compiler, start, attempts, status, installed)[source]
Bases:
object
Class for representing the build task for a package.
- add_dependent(pkg_id)[source]
Ensure the dependent package id is in the task’s list so it will be properly updated when this package is installed.
- Parameters:
pkg_id (str) – package identifier of the dependent package
- property cache_only
- property explicit
The package was explicitly requested by the user.
- flag_installed(installed)[source]
Ensure the dependency is not considered to still be uninstalled.
- Parameters:
installed (list) – the identifiers of packages that have been installed so far
- property is_root
The package was requested directly, but may or may not be explicit in an environment.
- property key
The key is the tuple (# uninstalled dependencies, sequence).
- property priority
The priority is based on the remaining uninstalled dependencies.
- property use_cache
- exception spack.installer.ExternalPackageError(message, long_msg=None, pkg=None)[source]
Bases:
InstallError
Raised by install() when a package is only for external use.
- class spack.installer.InstallAction[source]
Bases:
object
- INSTALL = 1
Do a standard install
- NONE = 0
Don’t perform an install
- OVERWRITE = 2
Do an overwrite install
- exception spack.installer.InstallError(message, long_msg=None, pkg=None)[source]
Bases:
SpackError
Raised when something goes wrong during install or uninstall.
The error can be annotated with a
pkg
attribute to allow the caller to get the package for which the exception was raised.
- exception spack.installer.InstallLockError(message, long_msg=None, pkg=None)[source]
Bases:
InstallError
Raised during install when something goes wrong with package locking.
- class spack.installer.PackageInstaller(installs=[])[source]
Bases:
object
Class for managing the install process for a Spack instance based on a bottom-up DAG approach.
This installer can coordinate concurrent batch and interactive, local and distributed (on a shared file system) builds for the same Spack instance.
- spack.installer.STATUS_ADDED = 'queued'
Build status indicating task has been added.
- spack.installer.STATUS_DEQUEUED = 'dequeued'
Build status indicating the task has been popped from the queue
- spack.installer.STATUS_FAILED = 'failed'
Build status indicating the spec failed to install
- spack.installer.STATUS_INSTALLED = 'installed'
Build status indicating the spec was successfully installed
- spack.installer.STATUS_INSTALLING = 'installing'
Build status indicating the spec is being installed (possibly by another process)
- spack.installer.STATUS_REMOVED = 'removed'
Build status indicating task has been removed (to maintain priority queue invariants).
- class spack.installer.TermStatusLine(enabled)[source]
Bases:
object
This class is used in distributed builds to inform the user that other packages are being installed by another process.
- exception spack.installer.UpstreamPackageError(message, long_msg=None, pkg=None)[source]
Bases:
InstallError
Raised during install when something goes wrong with an upstream package.
- spack.installer.archive_install_logs(pkg, phase_log_dir)[source]
Copy install logs to their destination directory(ies)
- Parameters:
pkg (spack.package_base.PackageBase) – the package that was built and installed
phase_log_dir (str) – path to the archive directory
- spack.installer.build_process(pkg, install_args)[source]
Perform the installation/build of the package.
This runs in a separate child process, and has its own process and python module space set up by build_environment.start_build_process().
This essentially wraps an instance of BuildProcessInstaller so that we can more easily create one in a subprocess.
This function’s return value is returned to the parent process.
- Parameters:
pkg (spack.package_base.PackageBase) – the package being installed.
install_args (dict) – arguments to do_install() from parent process.
- spack.installer.clear_failures()[source]
Remove all failure tracking markers for the Spack instance.
- spack.installer.combine_phase_logs(phase_log_files, log_path)[source]
Read set or list of logs and combine them into one file.
Each phase produces its own log, so this function concatenates all the separate phase log output files into pkg.log_path. It is written generally: it accepts any list of files and a log path to combine them into.
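The concatenation can be sketched as follows (a minimal version; the real function may differ in details such as error handling):

```python
from typing import Iterable


def combine_phase_logs_sketch(phase_log_files: Iterable[str], log_path: str) -> None:
    """Concatenate each phase's log into a single combined log file."""
    with open(log_path, "wb") as dest:
        for phase_log in phase_log_files:
            # Binary mode avoids decoding errors in build output.
            with open(phase_log, "rb") as src:
                dest.write(src.read())
```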
- spack.installer.dump_packages(spec, path)[source]
Dump all package information for a spec and its dependencies.
This creates a package repository within path for every namespace in the spec DAG, and fills the repos with package files and patch files for every node in the DAG.
- Parameters:
spec (spack.spec.Spec) – the Spack spec whose package information is to be dumped
path (str) – the path to the build packages directory
- spack.installer.get_dependent_ids(spec)[source]
Return a list of package ids for the spec’s dependents
- Parameters:
spec (spack.spec.Spec) – Concretized spec
- Returns:
list of package ids
- Return type:
- spack.installer.log(pkg)[source]
Copy provenance into the install directory on success
- Parameters:
pkg (spack.package_base.PackageBase) – the package that was built and installed
- spack.installer.package_id(pkg)[source]
A “unique” package identifier for installation purposes
The identifier is used to track build tasks, locks, install, and failure statuses.
The identifier needs to distinguish between combinations of compilers and packages for combinatorial environments.
- Parameters:
pkg (spack.package_base.PackageBase) – the package from which the identifier is derived
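A simplified sketch of such an identifier (the exact format is internal to Spack and assumed here): combining name, version, and the spec’s DAG hash distinguishes combinatorial builds, since the hash already encodes compiler and variant choices.

```python
def package_id_sketch(name: str, version: str, dag_hash: str) -> str:
    """Build an identifier unique per name/version/hash combination."""
    # The DAG hash encodes compiler and variant choices, so including it
    # distinguishes different builds of the same package version.
    return f"{name}-{version}-{dag_hash}"
```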
- spack.installer.print_install_test_log(pkg: PackageBase)[source]
Output the install test log file path, but only if there are test failures.
- Parameters:
pkg – instance of the package under test
spack.main module
This is the implementation of the Spack command line executable.
In a normal Spack installation, this is invoked from the bin/spack script after the system path is set up.
- spack.main.SHOW_BACKTRACE = False
Whether to print backtraces on error
- class spack.main.SpackArgumentParser(prog=None, usage=None, description=None, epilog=None, parents=[], formatter_class=<class 'argparse.HelpFormatter'>, prefix_chars='-', fromfile_prefix_chars=None, argument_default=None, conflict_handler='error', add_help=True, allow_abbrev=True, exit_on_error=True)[source]
Bases:
ArgumentParser
- class spack.main.SpackCommand(command_name, subprocess=False)[source]
Bases:
object
Callable object that invokes a spack command (for testing).
Example usage:
install = SpackCommand('install')
install('-v', 'mpich')
Use this to invoke Spack commands directly from Python and check their output.
- exception spack.main.SpackCommandError[source]
Bases:
Exception
Raised when SpackCommand execution fails.
- class spack.main.SpackHelpFormatter(prog, indent_increment=2, max_help_position=24, width=None)[source]
Bases:
RawTextHelpFormatter
- spack.main.aliases = {'rm': 'remove'}
top-level aliases for Spack commands
- spack.main.allows_unknown_args(command)[source]
Implements really simple argument injection for unknown arguments.
Commands may add an optional argument called “unknown args” to indicate they can handle unknown args, and we’ll pass the unknown args in.
- spack.main.finish_parse_and_run(parser, cmd_name, env_format_error)[source]
Finish parsing after we know the command to run.
- spack.main.get_spack_commit()[source]
Get the Spack git commit sha.
- Returns:
(str or None) the commit sha if available, otherwise None
- spack.main.get_version()[source]
Get a descriptive version of this instance of Spack.
Outputs ‘<PEP440 version> (<git commit sha>)’.
The commit sha is only added when available.
- spack.main.intro_by_level = {'long': 'Complete list of spack commands:', 'short': 'These are common spack commands:'}
intro text for help at different levels
- spack.main.levels = ['short', 'long']
help levels in order of detail (i.e., number of commands shown)
- spack.main.main(argv=None)[source]
This is the entry point for the Spack command.
main() itself is just an error handler – it handles errors for everything in Spack that makes it to the top level.
The logic is all in _main().
- Parameters:
argv (list or None) – command line arguments, NOT including the executable name. If None, parses from sys.argv.
- spack.main.make_argument_parser(**kwargs)[source]
Create a basic argument parser without any subcommands added.
- spack.main.options_by_level = {'long': 'all', 'short': ['h', 'k', 'V', 'color']}
control top-level spack options shown in basic vs. advanced help
- spack.main.print_setup_info(*info)[source]
Print basic information needed by setup-env.[c]sh.
- Parameters:
info (list) – list of things to print: comma-separated list of ‘csh’, ‘sh’, or ‘modules’
This is in
main.py
to make it fast; the setup scripts need to invoke spack in login scripts, and it needs to be quick.
- spack.main.required_command_properties = ['level', 'section', 'description']
Properties that commands are required to set.
- spack.main.section_descriptions = {'admin': 'administration', 'basic': 'query packages', 'build': 'build packages', 'config': 'configuration', 'developer': 'developer', 'environment': 'environment', 'extensions': 'extensions', 'help': 'more help', 'packaging': 'create packages', 'system': 'system'}
Longer text for each section, to show in help
- spack.main.section_order = {'basic': ['list', 'info', 'find'], 'build': ['fetch', 'stage', 'patch', 'configure', 'build', 'restage', 'install', 'uninstall', 'clean'], 'packaging': ['create', 'edit']}
preferential command order for some sections (e.g., build pipeline is in execution order, not alphabetical)
- spack.main.set_working_dir()[source]
Change the working directory to getcwd, or spack prefix if no cwd.
- spack.main.spack_working_dir = None
Recorded directory where spack command was originally invoked
- spack.main.stat_names = {'calls': (((1, -1),), 'call count'), 'cumtime': (((3, -1),), 'cumulative time'), 'cumulative': (((3, -1),), 'cumulative time'), 'filename': (((4, 1),), 'file name'), 'line': (((5, 1),), 'line number'), 'module': (((4, 1),), 'file name'), 'name': (((6, 1),), 'function name'), 'ncalls': (((1, -1),), 'call count'), 'nfl': (((6, 1), (4, 1), (5, 1)), 'name/file/line'), 'pcalls': (((0, -1),), 'primitive call count'), 'stdname': (((7, 1),), 'standard name'), 'time': (((2, -1),), 'internal time'), 'tottime': (((2, -1),), 'internal time')}
names of profile statistics
spack.mirror module
This file contains code for creating spack mirror directories. A mirror is an organized hierarchy containing specially named archive files. This enables spack to know where to find files in a mirror if the main server for a particular package is down. Or, if the computer where spack is run is not connected to the internet, it allows spack to download packages directly from a mirror (e.g., on an intranet).
- class spack.mirror.Mirror(fetch_url, push_url=None, name=None)[source]
Bases:
object
Represents a named location for storing source tarballs and binary packages.
Mirrors have a fetch_url that indicates where and how artifacts are fetched from them, and a push_url that indicates where and how artifacts are pushed to them. These two URLs are usually the same.
- property fetch_url
Get the valid, canonicalized fetch URL
- static from_url(url: str)[source]
Create an anonymous mirror by URL. This method validates the URL.
- property name
- property push_url
Get the valid, canonicalized push URL. Returns fetch URL if no custom push URL is defined
- class spack.mirror.MirrorCollection(mirrors=None, scope=None)[source]
Bases:
Mapping
A mapping of mirror names to mirrors.
- exception spack.mirror.MirrorError(msg, long_msg=None)[source]
Bases:
SpackError
Superclass of all mirror-creation related errors.
- class spack.mirror.MirrorReference(cosmetic_path, global_path=None)[source]
Bases:
object
A MirrorReference stores the relative paths where you can store a package/resource in a mirror directory.
The appropriate storage location is given by storage_path. The cosmetic_path property provides a reference that a human could generate themselves based on reading the details of the package.
A user can iterate over a MirrorReference object to get all the possible names that might be used to refer to the resource in a mirror; this includes names generated by previous naming schemes that are no longer reported by storage_path or cosmetic_path.
- property storage_path
- spack.mirror.create(path, specs, skip_unstable_versions=False)[source]
Create a directory to be used as a spack mirror, and fill it with package archives.
- Parameters:
path – Path to create a mirror directory hierarchy in.
specs – Any package versions matching these specs will be added to the mirror.
skip_unstable_versions – if true, this skips adding resources when they do not have a stable archive checksum (as determined by
fetch_strategy.stable_target
)
- Return Value:
Returns a tuple of lists: (present, mirrored, error)
present: Package specs that were already present.
mirrored: Package specs that were successfully mirrored.
error: Package specs that failed to mirror due to some error.
- spack.mirror.create_mirror_from_package_object(pkg_obj, mirror_cache, mirror_stats)[source]
Add a single package object to a mirror.
The package object is only required to have an associated spec with a concrete version.
- Parameters:
pkg_obj (spack.package_base.PackageBase) – package object to be added.
mirror_cache (spack.caches.MirrorCache) – mirror where to add the spec.
mirror_stats (spack.mirror.MirrorStats) – statistics on the current mirror
- Returns:
True if the spec was added successfully, False otherwise
- spack.mirror.get_all_versions(specs)[source]
Given a set of initial specs, return a new set of specs that includes each version of each package in the original set.
Note that if any spec in the original set specifies properties other than version, this information will be omitted in the new set; for example, the new set of specs will not include variant settings.
- spack.mirror.get_matching_versions(specs, num_versions=1)[source]
Get a spec for EACH known version matching any spec in the list. For concrete specs, this retrieves the concrete version and, if more than one version per spec is requested, retrieves the latest versions of the package.
- spack.mirror.mirror_archive_paths(fetcher, per_package_ref, spec=None)[source]
Returns a
MirrorReference
object which keeps track of the relative storage path of the resource associated with the specifiedfetcher
.
- spack.mirror.mirror_cache_and_stats(path, skip_unstable_versions=False)[source]
Return both a mirror cache and a mirror stats, starting from the path where a mirror ought to be created.
- Parameters:
path (str) – path to create a mirror directory hierarchy in.
skip_unstable_versions – if true, this skips adding resources when they do not have a stable archive checksum (as determined by
fetch_strategy.stable_target
)
- spack.mirror.require_mirror_name(mirror_name)[source]
Find a mirror by name and raise if it does not exist
- spack.mirror.supported_url_schemes = ('file', 'http', 'https', 'sftp', 'ftp', 's3', 'gs')
What schemes do we support
spack.mixins module
This module contains additional behavior that can be attached to any given package.
- spack.mixins.filter_compiler_wrappers(*files, **kwargs)[source]
Substitutes any path referring to a Spack compiler wrapper with the path of the underlying compiler that has been used.
If this isn’t done, the files will have CC, CXX, F77, and FC set to Spack’s generic cc, c++, f77, and f90. We want them to be bound to whatever compiler they were built with.
- Parameters:
*files – files to be filtered relative to the search root (which is, by default, the installation prefix)
**kwargs –
allowed keyword arguments
- after
specifies after which phase the files should be filtered (defaults to ‘install’)
- relative_root
path relative to prefix where to start searching for the files to be filtered. If not set, the install prefix will be used as the search root. It is highly recommended to set this, as searching from the installation prefix may affect performance severely in some cases.
- ignore_absent, backup
these two keyword arguments, if present, will be forwarded to
filter_file
(see its documentation for more information on their behavior)
- recursive
this keyword argument, if present, will be forwarded to
find
(see its documentation for more information on the behavior)
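The substitution this mixin performs amounts to rewriting compiler-wrapper paths inside installed files. A standalone sketch with hypothetical paths (the real mixin edits files in place via filter_file rather than operating on strings):

```python
def filter_wrapper_paths(text: str, replacements: dict) -> str:
    """Replace each compiler-wrapper path with the underlying compiler path.

    replacements maps wrapper path -> real compiler path (the paths used
    in the example below are illustrative, not Spack's actual layout).
    """
    for wrapper, real in replacements.items():
        text = text.replace(wrapper, real)
    return text
```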
spack.multimethod module
This module contains utilities for using multi-methods in spack. You can think of multi-methods like overloaded methods – they’re methods with the same name, and we need to select a version of the method based on some criteria. e.g., for overloaded methods, you would select a version of the method to call based on the types of its arguments.
In spack, multi-methods are used to ease the life of package authors. They allow methods like install() (or other methods called by install()) to declare multiple versions to be called when the package is instantiated with different specs. e.g., if the package is built with OpenMPI on x86_64, you might want to call a different install method than if it was built for mpich2 on BlueGene/Q. Likewise, you might want to do a different type of install for different versions of the package.
Multi-methods provide a simple decorator-based syntax for this that avoids overly complicated rat’s nests of if statements. Obviously, depending on the scenario, regular old conditionals might be clearer, so package authors should use their judgement.
- exception spack.multimethod.MultiMethodError(message)[source]
Bases:
SpackError
Superclass for multimethod dispatch errors
- class spack.multimethod.MultiMethodMeta(name, bases, attr_dict)[source]
Bases:
type
This allows us to track the class’s dict during instantiation.
- exception spack.multimethod.NoSuchMethodError(cls, method_name, spec, possible_specs)[source]
Bases:
SpackError
Raised when we can’t find a version of a multi-method.
- class spack.multimethod.SpecMultiMethod(default=None)[source]
Bases:
object
This implements a multi-method for Spack specs. Packages are instantiated with a particular spec, and you may want to execute different versions of methods based on what the spec looks like. For example, you might want to call a different version of install() for one platform than you call on another.
The SpecMultiMethod class implements a callable object that handles method dispatch. When it is called, it looks through registered methods and their associated specs, and it tries to find one that matches the package’s spec. If it finds one (and only one), it will call that method.
This is intended for use with decorators (see below). The decorator (see docs below) creates SpecMultiMethods and registers method versions with them.
- To register a method, you can do something like this:
mm = SpecMultiMethod()
mm.register("^chaos_5_x86_64_ib", some_method)
The object registered needs to be a Spec or some string that will parse to be a valid spec.
When the mm is actually called, it selects a version of the method to call based on the sys_type of the object it is called on.
See the docs for decorators below for more details.
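A toy version of this dispatch pattern (plain predicates stand in for Spec constraint matching, and all names here are illustrative only):

```python
class MultiMethodSketch:
    """Pick the registered function whose condition matches, falling back
    to a default. Real Spack matches Spec constraints; plain predicate
    callables stand in for them here."""

    def __init__(self, default=None):
        self.default = default
        self.methods = []

    def register(self, predicate, func):
        self.methods.append((predicate, func))

    def __call__(self, spec_like):
        matches = [f for pred, f in self.methods if pred(spec_like)]
        if len(matches) == 1:
            # Exactly one registered version matches: dispatch to it.
            return matches[0](spec_like)
        if self.default is not None:
            return self.default(spec_like)
        raise NotImplementedError("no matching method version")
```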
spack.package module
spack.util.package is a set of useful build tools and directives for packages.
Everything in this module is automatically imported into Spack package files.
spack.package_base module
This is where most of the action happens in Spack.
The spack package class structure is based strongly on Homebrew (http://brew.sh/), mainly because Homebrew makes it very easy to create packages.
- exception spack.package_base.ActivationError(msg, long_msg=None)[source]
Bases:
ExtensionError
Raised when there are problems activating an extension.
- exception spack.package_base.DependencyConflictError(conflict)[source]
Bases:
SpackError
Raised when the dependencies cannot be flattened as asked for.
- class spack.package_base.DetectablePackageMeta(name, bases, attr_dict)[source]
Bases:
object
Check if a package is detectable and add default implementations for the detection function.
- exception spack.package_base.ExtensionError(message, long_msg=None)[source]
Bases:
PackageError
Superclass for all errors having to do with extension packages.
- spack.package_base.FLAG_HANDLER_TYPE
Type of a flag handler callable.
alias of Callable[[str, Iterable[str]], Tuple[Optional[Iterable[str]], Optional[Iterable[str]], Optional[Iterable[str]]]]
- exception spack.package_base.InvalidPackageOpError(message, long_msg=None)[source]
Bases:
PackageError
Raised when someone tries to perform an invalid operation on a package.
- exception spack.package_base.NoURLError(cls)[source]
Bases:
PackageError
Raised when someone tries to build a URL for a package with no URLs.
- class spack.package_base.PackageBase(spec)[source]
Bases:
WindowsRPath
,PackageViewMixin
This is the superclass for all spack packages.
*The Package class*
At its core, a package consists of a set of software to be installed. A package may focus on a piece of software and its associated software dependencies or it may simply be a set, or bundle, of software. The former requires defining how to fetch, verify (via, e.g., sha256), build, and install that software and the packages it depends on, so that dependencies can be installed along with the package itself. The latter, sometimes referred to as a no-source package, requires only defining the packages to be built.
Packages are written in pure Python.
There are two main parts of a Spack package:
The package class. Classes contain directives, which are special functions that add metadata (versions, patches, dependencies, and other information) to packages (see directives.py). Directives provide the constraints that are used as input to the concretizer.
Package instances. Once instantiated, a package is essentially a software installer. Spack calls methods like do_install() on the Package object, and it uses those to drive user-implemented methods like patch(), install(), and other build steps. To install software, an instantiated package needs a concrete spec, which guides the behavior of the various install methods.
Packages are imported from repos (see repo.py).
Package DSL
Look in lib/spack/docs or check https://spack.readthedocs.io for the full documentation of the package domain-specific language. That used to be partially documented here, but as it grew, the docs here became increasingly out of date.
Package Lifecycle
A package’s lifecycle over a run of Spack looks something like this:
p = Package()      # Done for you by spack
p.do_fetch()       # downloads tarball from a URL (or VCS)
p.do_stage()       # expands tarball in a temp directory
p.do_patch()       # applies patches to expanded source
p.do_install()     # calls package's install() function
p.do_uninstall()   # removes install directory
although packages that do not have code have nothing to fetch, so they omit p.do_fetch().
There are also some other commands that clean the build area:
p.do_clean()       # removes the stage directory entirely
p.do_restage()     # removes the build directory and
                   # re-expands the archive.
The convention used here is that a do_* function is intended to be called internally by Spack commands (in spack.cmd). These aren’t for package writers to override, and doing so may break the functionality of the Package class.
Package creators have a lot of freedom, and they could technically override anything in this class. That is not usually required.
For most use cases, package creators typically just add attributes like homepage and, for a code-based package, url, or functions such as install(). There are many custom Package subclasses in the spack.build_systems package that make things even easier for specific build systems.
- classmethod all_patches()[source]
Retrieve all patches associated with the package.
Retrieves patches on the package itself as well as patches on the dependencies of the package.
- property all_urls
A list of all URLs in a package.
Check both class-level and version-specific URLs.
- Returns:
a list of URLs
- Return type:
- all_urls_for_version(version)[source]
Return all URLs derived from version_urls(), url, urls, and list_url (if it contains a version) in a package in that order.
- Parameters:
version (spack.version.Version) – the version for which a URL is sought
- property build_log_path
Return the expected (or current) build log file path. The path points to the staging build file until the software is successfully installed, when it points to the file in the installation directory.
- classmethod build_system_flags(name: str, flags: Iterable[str]) Tuple[Iterable[str] | None, Iterable[str] | None, Iterable[str] | None] [source]
flag_handler that passes flags to the build system arguments. Any package using build_system_flags must also implement flags_to_build_system_args, or derive from a class that implements it. Currently, AutotoolsPackage and CMakePackage implement it.
- property builder
- cache_extra_test_sources(srcs)[source]
Copy relative source paths to the corresponding install test subdir
This method is intended as an optional install test setup helper for grabbing source files/directories during the installation process and copying them to the installation test subdirectory for subsequent use during install testing.
- property cmake_prefix_paths
- property compiler
Get the spack.compiler.Compiler object used to build this package
- property configure_args_path
Return the configure args file path associated with staging.
- content_hash(content=None)[source]
Create a hash based on the artifacts and patches used to build this package.
- This includes:
source artifacts (tarballs, repositories) used to build;
content hashes (
sha256
’s) of all patches applied by Spack; and the canonicalized contents of the
package.py
recipe used to build.
This hash is only included in Spack’s DAG hash for concrete specs, but if it happens to be called on a package with an abstract spec, only applicable (i.e., determinable) portions of the hash will be included.
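The idea behind this hash can be sketched with plain hashlib. This is a minimal illustration, not Spack's actual implementation; all input values below are hypothetical, and the real code also canonicalizes the package.py source before hashing:

```python
import hashlib

def content_hash_sketch(source_sha256, patch_sha256s, recipe_bytes):
    """Combine the inputs content_hash() conceptually uses: the source
    artifact checksum, patch checksums, and the recipe contents."""
    h = hashlib.sha256()
    h.update(source_sha256.encode())
    for p in sorted(patch_sha256s):  # sort so patch order doesn't matter
        h.update(p.encode())
    h.update(hashlib.sha256(recipe_bytes).hexdigest().encode())
    return h.hexdigest()

digest = content_hash_sketch(
    "deadbeef" * 8,               # tarball checksum (hypothetical)
    ["ab" * 32, "cd" * 32],       # patch checksums (hypothetical)
    b"class Zlib(Package): ...",  # recipe source (hypothetical)
)
print(len(digest))  # 64 hex characters
```

Because every input is hashed deterministically, the same artifacts, patches, and recipe always yield the same digest.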
- classmethod dependencies_of_type(*deptypes)[source]
Get dependencies that can possibly have these deptypes.
This analyzes the package and determines which dependencies can be a certain kind of dependency. Note that they may not always be this kind of dependency, since dependencies can be optional, so something may be a build dependency in one configuration and a run dependency in another.
- do_fetch(mirror_only=False)[source]
Creates a stage directory and downloads the tarball for this package. Working directory will be set to the stage directory.
- do_install(**kwargs)[source]
Called by commands to install a package and/or its dependencies.
Package implementations should override install() to describe their build process.
- Parameters:
cache_only (bool) – Fail if binary package unavailable.
dirty (bool) – Don’t clean the build environment before installing.
explicit (bool) – True if package was explicitly installed, False if package was implicitly installed (as a dependency).
fail_fast (bool) – Fail if any dependency fails to install; otherwise, the default is to install as many dependencies as possible (i.e., best effort installation).
fake (bool) – Don’t really build; install fake stub files instead.
force (bool) – Install again, even if already installed.
install_deps (bool) – Install dependencies before installing this package
install_source (bool) – By default, source is not installed, but for debugging it might be useful to keep it around.
keep_prefix (bool) – Keep install prefix on failure. By default, destroys it.
keep_stage (bool) – By default, stage is destroyed only if there are no exceptions during build. Set to True to keep the stage even with exceptions.
restage (bool) – Force spack to restage the package source.
skip_patch (bool) – Skip patch stage of build if True.
stop_before (str) – stop execution before this installation phase (or None)
stop_at (str) – last installation phase to be executed (or None)
tests (bool or list or set) – False to run no tests, True to test all packages, or a list of package names to run tests for some
use_cache (bool) – Install from binary package, if available.
verbose (bool) – Display verbose build output (by default, suppresses it)
- property download_instr
Defines the default manual download instructions. Packages can override the property to provide more information.
- Returns:
default manual download instructions
- Return type:
(str)
- classmethod env_flags(name: str, flags: Iterable[str]) Tuple[Iterable[str] | None, Iterable[str] | None, Iterable[str] | None] [source]
flag_handler that adds all flags to canonical environment variables.
- property env_mods_path
Return the build environment modifications file path associated with staging.
- property env_path
Return the build environment file path associated with staging.
- extendable = False
Most packages are NOT extendable. Set to True if you want extensions.
- property extendee_args
Spec of the extendee of this package, or None if it is not an extension
- property extendee_spec
Spec of the extendee of this package, or None if it is not an extension
- extends(spec)[source]
Returns True if this package extends the given spec.
If
self.spec
is concrete, this returns whether this package extends the given spec.If
self.spec
is not concrete, this returns whether this package may extend the given spec.
- fetch_remote_versions(concurrency=128)[source]
Find remote versions of this package.
Uses
list_url
and any other URLs listed in the package file.
- Returns:
a dictionary mapping versions to URLs
- Return type:
- property fetcher
- find_valid_url_for_version(version)[source]
Returns a URL from which the specified version of this package may be downloaded after testing whether the url is valid. Will try url, urls, and list_url before failing.
- version: class Version
The version for which a URL is sought.
See Class Version (version.py)
- property flag_handler: Callable[[str, Iterable[str]], Tuple[Iterable[str] | None, Iterable[str] | None, Iterable[str] | None]]
- fullname = 'spack.package_base'
- fullnames = ['spack.package_base']
- global_license_dir = '/home/docs/checkouts/readthedocs.org/user_builds/spack/checkouts/v0.20.3/lib/spack/docs/_spack_root/etc/spack/licenses'
- property global_license_file
Returns the path where a global license file for this particular package should be stored.
- has_code = True
Most Spack packages are used to install source or binary code; packages that have no code can instead be used to install a set of other Spack packages.
- property home
- homepage: str | None = None
Package homepage where users can find more information about the package
- classmethod inject_flags(name: str, flags: Iterable[str]) Tuple[Iterable[str] | None, Iterable[str] | None, Iterable[str] | None] [source]
flag_handler that injects all flags through the compiler wrapper.
- property install_configure_args_path
Return the configure args file path on successful installation.
- property install_env_path
Return the build environment file path on successful installation.
- property install_log_path
Return the build log file path on successful installation.
- property install_test_root
Return the install test root directory.
- property installed
- property installed_upstream
- property is_extension
- license_comment = '#'
String. Contains the symbol used by the license manager to denote a comment. Defaults to
#
.
- license_files: List[str] = []
List of strings. These are files that the software searches for when looking for a license. All file paths must be relative to the installation directory. More complex packages like Intel may require multiple licenses for individual components. Defaults to the empty list.
- license_required = False
Boolean. If set to
True
, this software requires a license. If set toFalse
, all of thelicense_*
attributes will be ignored. Defaults toFalse
.
- license_url = ''
String. A URL pointing to license setup instructions for the software. Defaults to the empty string.
- license_vars: List[str] = []
List of strings. Environment variables that can be set to tell the software where to look for a license if it is not in the usual location. Defaults to the empty list.
- list_depth = 0
Link depth to which list_url should be searched for new versions
- property log_path
Return the build log file path associated with staging.
- maintainers: List[str] = []
List of strings which contains GitHub usernames of package maintainers. Do not include @ here in order not to unnecessarily ping the users.
- manual_download = False
Boolean. Set to
True
for packages that require a manual download. This is currently used by package sanity tests and generation of a more meaningful fetch failure error.
- metadata_attrs = ['homepage', 'url', 'urls', 'list_url', 'extendable', 'parallel', 'make_jobs', 'maintainers', 'tags']
List of attributes to be excluded from a package’s hash.
- property metadata_dir
Return the install metadata directory.
- module = <module 'spack.package_base' from '/home/docs/checkouts/readthedocs.org/user_builds/spack/checkouts/v0.20.3/lib/spack/docs/_spack_root/lib/spack/spack/package_base.py'>
- name = 'package_base'
- namespace = 'spack'
- nearest_url(version)[source]
Finds the URL with the “closest” version to
version
. This uses the following precedence order:
Find the next lowest or equal version with a URL.
If no lower URL, return the next higher URL.
If no higher URL, return None.
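The precedence order above can be sketched as a small standalone lookup. This is an illustrative toy, not Spack's implementation; versions here are plain tuples rather than spack.version objects, and the URLs are hypothetical:

```python
def nearest_url(version, version_urls):
    """Pick the URL whose version is "closest" to ``version``:
    the next lower-or-equal version first, then the next higher
    version, else None."""
    known = sorted(version_urls)
    lower = [v for v in known if v <= version]
    if lower:
        return version_urls[lower[-1]]   # next lowest or equal version
    higher = [v for v in known if v > version]
    if higher:
        return version_urls[higher[0]]   # no lower URL: next higher one
    return None

urls = {(1, 0): "https://example.com/pkg-1.0.tar.gz",
        (2, 0): "https://example.com/pkg-2.0.tar.gz"}
print(nearest_url((1, 5), urls))  # falls back to the 1.0 URL
```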
List of shared objects that should be replaced with a different library at runtime. Typically includes stub libraries like libcuda.so. When linking against a library listed here, the dependent will only record its soname or filename, not its absolute path, so that the dynamic linker will search for it. Note: accepts both file names and directory names, for example
["libcuda.so", "stubs"]
will ensure libcuda.so and all libraries in the stubs directory are not bound by path.
- package_dir = '/home/docs/checkouts/readthedocs.org/user_builds/spack/checkouts/v0.20.3/lib/spack/docs/_spack_root/lib/spack/spack'
- parallel = True
By default we build in parallel. Subclasses can override this.
- property phase_log_files
Find sorted phase log files written to the staging directory
- classmethod possible_dependencies(transitive=True, expand_virtuals=True, deptype='all', visited=None, missing=None, virtuals=None)[source]
Return dict of possible dependencies of this package.
- Parameters:
transitive (bool or None) – return all transitive dependencies if True, only direct dependencies if False (default True).
expand_virtuals (bool or None) – expand virtual dependencies into all possible implementations (default True)
deptype (str or tuple or None) – dependency types to consider
visited (dict or None) – dict of names of dependencies visited so far, mapped to their immediate dependencies’ names.
missing (dict or None) – dict to populate with packages and their missing dependencies.
virtuals (set) – if provided, populate with virtuals seen so far.
- Returns:
- dictionary mapping dependency names to their
immediate dependencies
- Return type:
(dict)
Each item in the returned dictionary maps a (potentially transitive) dependency of this package to its possible immediate dependencies. If
expand_virtuals
isFalse
, virtual package names wil be inserted as keys mapped to empty sets of dependencies. Virtuals, if not expanded, are treated as though they have no immediate dependencies.Missing dependencies by default are ignored, but if a missing dict is provided, it will be populated with package names mapped to any dependencies they have that are in no repositories. This is only populated if transitive is True.
Note: the returned dict includes the package itself.
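The shape of the returned dictionary can be illustrated with a toy traversal over a hard-coded dependency graph. This sketch ignores virtuals, deptypes, and missing packages; the package names are only examples:

```python
def possible_dependencies(pkg, graph, transitive=True):
    """Toy version of the dict described above: map each (transitive)
    dependency name, plus the package itself, to its immediate deps."""
    result = {}
    queue = [pkg]
    while queue:
        name = queue.pop()
        if name in result:
            continue
        deps = set(graph.get(name, ()))
        result[name] = deps
        if transitive:
            queue.extend(deps)
    return result

graph = {"mpileaks": {"mpi", "callpath"},
         "callpath": {"dyninst"},
         "dyninst": set(),
         "mpi": set()}
print(sorted(possible_dependencies("mpileaks", graph)))
# ['callpath', 'dyninst', 'mpi', 'mpileaks']
```

Note that, as in the real method, the result includes the queried package itself as a key.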
- property prefix
Get the prefix into which this package should be installed.
- provides(vpkg_name)[source]
True if this package provides a virtual package with the specified name
- property rpath
Get the rpath this package links with, as a list of paths.
- property rpath_args
Get the rpath args as a string, with -Wl,-rpath, for each element
- run_test(exe, options=[], expected=[], status=0, installed=False, purpose=None, skip_missing=False, work_dir=None)[source]
Run the test and confirm the expected results are obtained
Log any failures and continue, they will be re-raised later
- Parameters:
exe (str) – the name of the executable
options (str or list) – list of options to pass to the runner
expected (str or list) – list of expected output strings. Each string is a regex expected to match part of the output.
status (int or list) – possible passing status values with 0 meaning the test is expected to succeed
installed (bool) – if
True
, the executable must be in the install prefix
purpose (str) – message to display before running test
skip_missing (bool) – skip the test if the executable is not in the install prefix bin directory or the provided work_dir
work_dir (str or None) – path to the smoke test directory
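The core check run_test performs can be sketched without Spack: run an executable, verify its exit status, and match each expected regex against the output. This is a simplified stand-in, not the real method (which also logs failures and handles install prefixes):

```python
import re
import subprocess
import sys

def run_test_sketch(args, expected=(), status=0):
    """Minimal stand-in for run_test(): run a command, check its exit
    status, and check each expected regex against the combined output."""
    proc = subprocess.run(args, capture_output=True, text=True)
    if proc.returncode != status:
        return False
    output = proc.stdout + proc.stderr
    return all(re.search(pat, output) for pat in expected)

ok = run_test_sketch(
    [sys.executable, "-c", "print('hello world')"],
    expected=[r"hello\s+world"],
)
print(ok)  # True
```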
- run_tests = False
By default do not run tests within package’s install()
- sanity_check_is_dir: List[str] = []
List of prefix-relative directory paths (or a single path). If these do not exist after install, or if they exist but are not directories, sanity checks will fail.
- sanity_check_is_file: List[str] = []
List of prefix-relative file paths (or a single path). If these do not exist after install, or if they exist but are not files, sanity checks fail.
- setup_dependent_package(module, dependent_spec)[source]
Set up Python module-scope variables for dependent packages.
Called before the install() method of dependents.
Default implementation does nothing, but this can be overridden by an extendable package to set up the module of its extensions. This is useful if there are some common steps to installing all extensions for a certain package.
Examples:
Extensions often need to invoke the
python
interpreter from the Python installation being extended. This routine can put apython()
Executable object in the module scope for the extension package to simplify extension installs.
MPI compilers could set some variables in the dependent’s scope that point to
mpicc
,
mpicxx
, etc., allowing them to be called by common name regardless of which MPI is used.
BLAS/LAPACK implementations can set some variables indicating the path to their libraries, since these paths differ by BLAS/LAPACK implementation.
- Parameters:
module (spack.package_base.PackageBase.module) – The Python
module
object of the dependent package. Packages can use this to set module-scope variables for the dependent to use.dependent_spec (spack.spec.Spec) – The spec of the dependent package about to be built. This allows the extendee (self) to query the dependent’s state. Note that this package’s spec is available as
self.spec
.
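The first example above (injecting a python helper into the extension's module scope) can be sketched in isolation. The class name, prefix, and injected attribute are hypothetical, and a SimpleNamespace stands in for the dependent package's Python module:

```python
from types import SimpleNamespace

class PythonLike:
    """Sketch of an extendable package injecting a module-scope helper
    for its extensions, loosely modeled on the python example above."""
    prefix = "/opt/python"  # hypothetical install prefix

    def setup_dependent_package(self, module, dependent_spec):
        # Give extensions a ready-to-use ``python`` name so their
        # install() can invoke the interpreter being extended.
        module.python = f"{self.prefix}/bin/python"

ext_module = SimpleNamespace()  # stand-in for the extension's module
PythonLike().setup_dependent_package(ext_module, dependent_spec=None)
print(ext_module.python)  # /opt/python/bin/python
```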
- setup_dependent_run_environment(env, dependent_spec)[source]
Sets up the run environment of packages that depend on this one.
This is similar to
setup_run_environment
, but it is used to modify the run environments of packages that depend on this one.This gives packages like Python and others that follow the extension model a way to implement common environment or run-time settings for dependencies.
- Parameters:
env (spack.util.environment.EnvironmentModifications) – environment modifications to be applied when the dependent package is run. Package authors can call methods on it to alter the build environment.
dependent_spec (spack.spec.Spec) – The spec of the dependent package about to be run. This allows the extendee (self) to query the dependent’s state. Note that this package’s spec is available as
self.spec
- setup_run_environment(env)[source]
Sets up the run environment for a package.
- Parameters:
env (spack.util.environment.EnvironmentModifications) – environment modifications to be applied when the package is run. Package authors can call methods on it to alter the run environment.
- property stage
Get the build staging area for this package.
This automatically instantiates a
Stage
object if the package doesn’t have one yet, but it does not create the Stage directory on the filesystem.
- test_requires_compiler: bool = False
Set to
True
to indicate the stand-alone test requires a compiler. It is used to ensure a compiler and build dependencies like ‘cmake’ are available to build a custom test code.
- test_suite: TestSuite | None = None
TestSuite instance used to manage stand-alone tests for 1+ specs.
- property tester
- property times_log_path
Return the times log json file.
- transitive_rpaths = True
When True, add RPATHs for the entire DAG. When False, add RPATHs only for immediate dependencies.
- unit_test_check()[source]
Hook for unit tests to assert things about package internals.
Unit tests can override this function to perform checks after
Package.install
and all post-install hooks run, but before the database is updated.The overridden function may indicate that the install procedure should terminate early (before updating the database) by returning
False
(or any value such thatbool(result)
isFalse
).- Returns:
True
to continue,False
to skipinstall()
- Return type:
(bool)
- update_external_dependencies(extendee_spec=None)[source]
Method to override in package classes to handle external dependencies
- url_for_version(version)[source]
Returns a URL from which the specified version of this package may be downloaded.
- version: class Version
The version for which a URL is sought.
See Class Version (version.py)
- url_version(version)[source]
Given a version, this returns a string that should be substituted into the package’s URL to download that version.
By default, this just returns the version string. Subclasses may need to override this, e.g. for boost versions where you need to ensure that there are _’s in the download URL.
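A boost-style override of this method might look like the following sketch (the class name is hypothetical; the dot-to-underscore substitution matches the convention described above):

```python
class BoostLike:
    """Sketch of a url_version override for a project whose download
    URLs use underscores where the version string uses dots."""
    def url_version(self, version):
        return str(version).replace(".", "_")

print(BoostLike().url_version("1.82.0"))  # 1_82_0
```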
- use_xcode = False
By default do not setup mockup XCode on macOS with Clang
- property version
- classmethod version_urls()[source]
OrderedDict of explicitly defined URLs for versions of this package.
- Returns:
An OrderedDict mapping versions of this package to URLs, sorted by version.
A version’s URL only appears in the result if it has an explicitly defined
url
argument. So, this dict may be empty if a package only defines
url
at the top level.
- view()[source]
Create a view with the prefix of this package as the root. Extensions added to this view will modify the installation prefix of this package.
- virtual = False
By default, packages are not virtual. Virtual packages override this attribute.
- property virtuals_provided
virtual packages provided by this package with its spec
- exception spack.package_base.PackageError(message, long_msg=None)[source]
Bases:
SpackError
Raised when something is wrong with a package definition.
- class spack.package_base.PackageMeta(name, bases, attr_dict)[source]
Bases:
PhaseCallbacksMeta
,DetectablePackageMeta
,DirectiveMeta
,MultiMethodMeta
Package metaclass for supporting directives (e.g., depends_on) and phases
- exception spack.package_base.PackageStillNeededError(spec, dependents)[source]
Bases:
InstallError
Raised when package is still needed by another on uninstall.
- class spack.package_base.PackageViewMixin[source]
Bases:
object
This collects all functionality related to adding installed Spack package to views. Packages can customize how they are added to views by overriding these functions.
- add_files_to_view(view, merge_map, skip_if_exists=True)[source]
Given a map of package files to destination paths in the view, add the files to the view. By default this adds all files. Alternative implementations may skip some files, for example if other packages linked into the view already include the file.
- Parameters:
view (spack.filesystem_view.FilesystemView) – the view that’s updated
merge_map (dict) – maps absolute source paths to absolute dest paths for all files in from this package.
skip_if_exists (bool) – when True, don’t link files in view when they already exist. When False, always link files, without checking if they already exist.
- remove_files_from_view(view, merge_map)[source]
Given a map of package files to files currently linked in the view, remove the files from the view. The default implementation removes all files. Alternative implementations may not remove all files. For example if two packages include the same file, it should only be removed when both packages are removed.
- view_destination(view)[source]
The target root directory: each file is added relative to this directory.
- view_file_conflicts(view, merge_map)[source]
Report any files which prevent adding this package to the view. The default implementation looks for any files which already exist. Alternative implementations may allow some of the files to exist in the view (in this case they would be omitted from the results).
- class spack.package_base.WindowsRPath[source]
Bases:
object
Collection of functionality surrounding Windows RPATH specific features
This is essentially meaningless for all other platforms due to their use of RPATH. All methods within this class are no-ops on non Windows. Packages can customize and manipulate this class as they would a genuine RPATH, i.e. adding directories that contain runtime library dependencies
- win_add_library_dependent()[source]
Return extra set of directories that require linking for package
This method should be overridden by packages that produce binaries/libraries/python extension modules/etc that are installed into directories outside a package’s bin, lib, and lib64 directories, but still require linking against one of the packages dependencies, or other components of the package itself. No-op otherwise.
- Returns:
List of additional directories that require linking
- spack.package_base.build_system_flags(name: str, flags: Iterable[str]) Tuple[Iterable[str] | None, Iterable[str] | None, Iterable[str] | None]
flag_handler that passes flags to the build system arguments. Any package using build_system_flags must also implement flags_to_build_system_args, or derive from a class that implements it. Currently, AutotoolsPackage and CMakePackage implement it.
- spack.package_base.deprecated_version(pkg, version)[source]
Return True if the version is deprecated, False otherwise.
- Parameters:
pkg (PackageBase) – The package whose version is to be checked.
version (str or spack.version.StandardVersion) – The version being checked
- spack.package_base.detectable_packages = {}
Registers which packages are detectable, by repo and package name. Requires a pass over the package repositories to be filled.
- spack.package_base.env_flags(name: str, flags: Iterable[str]) Tuple[Iterable[str] | None, Iterable[str] | None, Iterable[str] | None]
flag_handler that adds all flags to canonical environment variables.
- spack.package_base.flatten_dependencies(spec, flat_dir)[source]
Make each dependency of spec present in dir via symlink.
- spack.package_base.inject_flags(name: str, flags: Iterable[str]) Tuple[Iterable[str] | None, Iterable[str] | None, Iterable[str] | None]
flag_handler that injects all flags through the compiler wrapper.
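A custom flag_handler follows the same contract as the three functions above: it receives a flag name and a list of flags, and returns a 3-tuple of (flags injected via the compiler wrapper, flags set in environment variables, flags passed as build-system arguments). The handler below is a hypothetical example of splitting flags between the first two destinations:

```python
def my_flag_handler(name, flags):
    """Hypothetical flag_handler: inject optimization flags through the
    compiler wrapper and route the rest to environment variables."""
    if name == "cflags":
        inject = [f for f in flags if f.startswith("-O")]
        env = [f for f in flags if not f.startswith("-O")]
        return (inject, env, None)
    # all other flag names: inject everything through the wrapper
    return (flags, None, None)

print(my_flag_handler("cflags", ["-O2", "-g"]))  # (['-O2'], ['-g'], None)
```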
- spack.package_base.install_dependency_symlinks(pkg, spec, prefix)[source]
Execute a dummy install and flatten dependencies.
This routine can be used in a
package.py
definition by settinginstall = install_dependency_symlinks
. This feature comes in handy for creating a common location for the installation of third-party libraries.
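The flattening performed here amounts to symlinking each dependency prefix into one directory. A minimal self-contained sketch (hypothetical package names, temporary directories standing in for install prefixes):

```python
import os
import tempfile

def flatten(dep_prefixes, flat_dir):
    """Sketch of flatten_dependencies(): expose each dependency prefix
    inside ``flat_dir`` via a symlink named after the package."""
    os.makedirs(flat_dir, exist_ok=True)
    for name, prefix in dep_prefixes.items():
        os.symlink(prefix, os.path.join(flat_dir, name))

with tempfile.TemporaryDirectory() as tmp:
    dep = os.path.join(tmp, "zlib-prefix")   # fake install prefix
    os.makedirs(dep)
    flatten({"zlib": dep}, os.path.join(tmp, "flat"))
    print(os.path.islink(os.path.join(tmp, "flat", "zlib")))  # True
```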
- spack.package_base.on_package_attributes(**attr_dict)[source]
Decorator: executes instance function only if object has attr values.
Executes the decorated method only if at the moment of calling the instance has attributes that are equal to certain values.
- Parameters:
attr_dict (dict) – dictionary mapping attribute names to their required values
- spack.package_base.possible_dependencies(*pkg_or_spec, **kwargs)[source]
Get the possible dependencies of a number of packages.
See
PackageBase.possible_dependencies
for details.
- spack.package_base.preferred_version(pkg)[source]
Returns the preferred version of the package.
- Parameters:
pkg (PackageBase) – The package whose versions are to be assessed.
- spack.package_base.spack_times_log = 'install_times.json'
Filename of json with total build and phase times (seconds)
spack.package_prefs module
- class spack.package_prefs.PackagePrefs(pkgname, component, vpkg=None, all=True)[source]
Bases:
object
Defines the sort order for a set of specs.
Spack’s package preference implementation uses PackagePrefs objects to define sort order. The PackagePrefs class looks at Spack’s packages.yaml configuration and, when called on a spec, returns a key that can be used to sort that spec in order of the user’s preferences.
You can use it like this:
# key function sorts CompilerSpecs for mpich in order of preference
kf = PackagePrefs('mpich', 'compiler')
compiler_list.sort(key=kf)
Or like this:
# key function to sort VersionLists for OpenMPI in order of preference
kf = PackagePrefs('openmpi', 'version')
version_list.sort(key=kf)
Optionally, you can sort in order of preferred virtual dependency providers. To do that, provide ‘providers’ and a third argument denoting the virtual package (e.g.,
mpi
):
kf = PackagePrefs('trilinos', 'providers', 'mpi')
provider_spec_list.sort(key=kf)
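The underlying idea, turning a configured preference list into a sort key, can be sketched without Spack. This toy version sorts plain provider names; unlisted items sort after listed ones, and the names are only examples:

```python
def make_pref_key(preferred_order):
    """Sketch of the PackagePrefs idea: build a sort key from a
    configured preference list (as packages.yaml would provide)."""
    rank = {name: i for i, name in enumerate(preferred_order)}
    # anything not mentioned in the preferences sorts last
    return lambda name: rank.get(name, len(preferred_order))

providers = ["openmpi", "mpich", "mvapich2"]
providers.sort(key=make_pref_key(["mpich", "openmpi"]))
print(providers)  # ['mpich', 'openmpi', 'mvapich2']
```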
- classmethod has_preferred_providers(pkgname, vpkg)[source]
Whether a specific package has preferred providers for a virtual package.
- classmethod has_preferred_targets(pkg_name)[source]
Whether a specific package has preferred targets.
- exception spack.package_prefs.VirtualInPackagesYAMLError(message, long_message=None)[source]
Bases:
SpackError
Raised when a disallowed virtual is found in packages.yaml
- spack.package_prefs.get_package_dir_permissions(spec)[source]
Return the permissions configured for the spec.
Include the GID bit if group permissions are on. This makes the group attribute sticky for the directory. Package-specific settings take precedence over settings for
all
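The GID-bit behavior described above can be sketched with the stat module's permission constants (a simplified model, not Spack's actual code):

```python
import stat

def add_group_sticky(mode):
    """Sketch of the rule above: when group permissions are enabled,
    also set the setgid bit so files created in the directory inherit
    its group."""
    if mode & stat.S_IRWXG:   # any group permission bit set?
        mode |= stat.S_ISGID  # make the group attribute sticky
    return mode

print(oct(add_group_sticky(0o750)))  # 0o2750
```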
- spack.package_prefs.get_package_group(spec)[source]
Return the unix group associated with the spec.
Package-specific settings take precedence over settings for
all
- spack.package_prefs.get_package_permissions(spec)[source]
Return the permissions configured for the spec.
Package-specific settings take precedence over settings for
all
spack.package_test module
- spack.package_test.compare_output(current_output, blessed_output)[source]
Compare blessed and current output of executables.
spack.parser module
Parser for spec literals
Here is the EBNF grammar for a spec:
spec = [name] [node_options] { ^ node } |
[name] [node_options] hash |
filename
node = name [node_options] |
[name] [node_options] hash |
filename
node_options = [@(version_list|version_pair)] [%compiler] { variant }
hash = / id
filename = (.|/|[a-zA-Z0-9-_]*/)([a-zA-Z0-9-_./]*)(.json|.yaml)
name = id | namespace id
namespace = { id . }
variant = bool_variant | key_value | propagated_bv | propagated_kv
bool_variant = +id | ~id | -id
propagated_bv = ++id | ~~id | --id
key_value = id=id | id=quoted_id
propagated_kv = id==id | id==quoted_id
compiler = id [@version_list]
version_pair = git_version=vid
version_list = (version|version_range) [ { , (version|version_range)} ]
version_range = vid:vid | vid: | :vid | :
version = vid
git_version = git.(vid) | git_hash
git_hash = [A-Fa-f0-9]{40}
quoted_id = " id_with_ws " | ' id_with_ws '
id_with_ws = [a-zA-Z0-9_][a-zA-Z_0-9-.\s]*
vid = [a-zA-Z0-9_][a-zA-Z_0-9-.]*
id = [a-zA-Z0-9_][a-zA-Z_0-9-]*
Identifiers using the <name>=<value> command, such as architectures and compiler flags, require a space before the name.
There is one context-sensitive part: ids in versions may contain ‘.’, while other ids may not.
There is one ambiguity: since ‘-’ is allowed in an id, you need to put whitespace before -variant for it to be tokenized properly. You can either use whitespace, or you can just use ~variant since it means the same thing. Spack uses ~variant in directory names and in the canonical form of specs to avoid ambiguity. Both are provided because ~ can cause shell expansion when it is the first character in an id typed on the command line.
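The matching strategy used below, joining per-token regexes with "|" and letting declaration order decide precedence, can be demonstrated with a drastically simplified token set (these patterns are illustrative only; the real ones are far stricter):

```python
import re

# Simplified token kinds in precedence order, mirroring the idea that
# the grammar is matched by "|".join(...) of per-token regexes.
TOKEN_REGEXES = [
    r"(?P<DEPENDENCY>\^)",
    r"(?P<VERSION>@[A-Za-z0-9_.:,=]+)",
    r"(?P<COMPILER>%[A-Za-z0-9_.-]+)",
    r"(?P<BOOL_VARIANT>[~+][A-Za-z0-9_.-]+)",
    r"(?P<NAME>[A-Za-z0-9_][A-Za-z0-9_-]*)",
    r"(?P<WS>\s+)",
]
SCANNER = re.compile("|".join(TOKEN_REGEXES))

def tokenize(text):
    """Scan text left to right; m.lastgroup names the winning token."""
    return [(m.lastgroup, m.group()) for m in SCANNER.finditer(text)
            if m.lastgroup != "WS"]

print(tokenize("mpileaks @2.3 %gcc +debug ^mpich"))
```

Because the alternation tries branches in declaration order, `+debug` is tokenized as a BOOL_VARIANT rather than as a bare NAME, which is why the ordering of TOKEN_REGEXES matters.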
- spack.parser.ALL_TOKENS = re.compile('(?P<DEPENDENCY>(\\^))|(?P<VERSION_HASH_PAIR>(@(((git\\.((([a-zA-Z_0-9][a-zA-Z_0-9\\-]*)(\\.([a-zA-Z_0-9][a-zA-Z_0-9\\-]*))+)|([a-zA-Z_0-9][a-zA-Z_0-9\\-]*)))|(([A-Fa-f0-9]{40}))))=(=?([a-zA-Z0-9_][a-)
Regex to scan a valid text
- spack.parser.ANALYSIS_REGEX = re.compile('(?P<DEPENDENCY>(\\^))|(?P<VERSION_HASH_PAIR>(@(((git\\.((([a-zA-Z_0-9][a-zA-Z_0-9\\-]*)(\\.([a-zA-Z_0-9][a-zA-Z_0-9\\-]*))+)|([a-zA-Z_0-9][a-zA-Z_0-9\\-]*)))|(([A-Fa-f0-9]{40}))))=(=?([a-zA-Z0-9_][a-)
Regex to analyze an invalid text
- spack.parser.ERROR_HANDLING_REGEXES = ['(?P<DEPENDENCY>(\\^))', '(?P<VERSION_HASH_PAIR>(@(((git\\.((([a-zA-Z_0-9][a-zA-Z_0-9\\-]*)(\\.([a-zA-Z_0-9][a-zA-Z_0-9\\-]*))+)|([a-zA-Z_0-9][a-zA-Z_0-9\\-]*)))|(([A-Fa-f0-9]{40}))))=(=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b))))', '(?P<VERSION>(@\\s*(((=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)\\s*:\\s*=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)(?!\\s*=)|:\\s*=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)(?!\\s*=)|=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)\\s*:|:)|=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b))(\\s*[,]\\s*((=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)\\s*:\\s*=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)(?!\\s*=)|:\\s*=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)(?!\\s*=)|=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)\\s*:|:)|=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)))*)))', '(?P<PROPAGATED_BOOL_VARIANT>((\\+\\+|~~|--)\\s*[a-zA-Z_0-9][a-zA-Z_0-9\\-.]*))', '(?P<BOOL_VARIANT>([~+-]\\s*[a-zA-Z_0-9][a-zA-Z_0-9\\-.]*))', '(?P<PROPAGATED_KEY_VALUE_PAIR>([a-zA-Z_0-9][a-zA-Z_0-9\\-.]*\\s*==\\s*(([a-zA-Z_0-9\\-+\\*.,:=\\~\\/\\\\]+)|[\\"\']+([a-zA-Z_0-9\\-+\\*.,:=\\~\\/\\\\\\s]+)[\\"\']+)))', '(?P<KEY_VALUE_PAIR>([a-zA-Z_0-9][a-zA-Z_0-9\\-.]*\\s*=\\s*(([a-zA-Z_0-9\\-+\\*.,:=\\~\\/\\\\]+)|[\\"\']+([a-zA-Z_0-9\\-+\\*.,:=\\~\\/\\\\\\s]+)[\\"\']+)))', '(?P<COMPILER_AND_VERSION>(%\\s*([a-zA-Z_0-9][a-zA-Z_0-9\\-.]*)([\\s]*)@\\s*(((=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)\\s*:\\s*=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)(?!\\s*=)|:\\s*=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)(?!\\s*=)|=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)\\s*:|:)|=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b))(\\s*[,]\\s*((=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)\\s*:\\s*=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)(?!\\s*=)|:\\s*=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)(?!\\s*=)|=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)\\s*:|:)|=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)))*)))', '(?P<COMPILER>(%\\s*([a-zA-Z_0-9][a-zA-Z_0-9\\-.]*)))', '(?P<FILENAME>((\\.|\\/|[a-zA-Z0-9-_]*\\/)([a-zA-Z0-9-_\\.\\/]*)(\\.json|\\.yaml)))', 
'(?P<FULLY_QUALIFIED_PACKAGE_NAME>((([a-zA-Z_0-9][a-zA-Z_0-9\\-]*)(\\.([a-zA-Z_0-9][a-zA-Z_0-9\\-]*))+)))', '(?P<UNQUALIFIED_PACKAGE_NAME>(([a-zA-Z_0-9][a-zA-Z_0-9\\-]*)))', '(?P<DAG_HASH>(/([a-zA-Z_0-9]+)))', '(?P<WS>(\\s+))', '(?P<UNEXPECTED>(.[\\s]*))']
List of all valid regexes followed by error analysis regexes
- class spack.parser.ErrorTokenType(value, names=None, *, module=None, qualname=None, type=None, start=1, boundary=None)[source]
Bases:
TokenBase
Enum with regexes for error analysis
- UNEXPECTED = 1
- class spack.parser.FileParser(ctx)[source]
Bases:
object
Parse a single spec from a JSON or YAML file
- ctx
- spack.parser.IDENTIFIER = '([a-zA-Z_0-9][a-zA-Z_0-9\\-]*)'
Valid name for specs and variants. Here we are not using the previous “w[w.-]*” since that would match most characters that can be part of a word in any language
- class spack.parser.SpecNodeParser(ctx)[source]
Bases:
object
Parse a single spec node from a stream of tokens
- ctx
- has_compiler
- has_hash
- has_version
- class spack.parser.SpecParser(literal_str: str)[source]
Bases:
object
Parse text into specs
- ctx
- literal_str
- exception spack.parser.SpecParsingError(message, token, text)[source]
Bases:
SpecSyntaxError
Error when parsing tokens
- exception spack.parser.SpecTokenizationError(matches, text)[source]
Bases:
SpecSyntaxError
Syntax error in a spec string
- spack.parser.TOKEN_REGEXES = ['(?P<DEPENDENCY>(\\^))', '(?P<VERSION_HASH_PAIR>(@(((git\\.((([a-zA-Z_0-9][a-zA-Z_0-9\\-]*)(\\.([a-zA-Z_0-9][a-zA-Z_0-9\\-]*))+)|([a-zA-Z_0-9][a-zA-Z_0-9\\-]*)))|(([A-Fa-f0-9]{40}))))=(=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b))))', '(?P<VERSION>(@\\s*(((=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)\\s*:\\s*=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)(?!\\s*=)|:\\s*=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)(?!\\s*=)|=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)\\s*:|:)|=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b))(\\s*[,]\\s*((=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)\\s*:\\s*=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)(?!\\s*=)|:\\s*=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)(?!\\s*=)|=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)\\s*:|:)|=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)))*)))', '(?P<PROPAGATED_BOOL_VARIANT>((\\+\\+|~~|--)\\s*[a-zA-Z_0-9][a-zA-Z_0-9\\-.]*))', '(?P<BOOL_VARIANT>([~+-]\\s*[a-zA-Z_0-9][a-zA-Z_0-9\\-.]*))', '(?P<PROPAGATED_KEY_VALUE_PAIR>([a-zA-Z_0-9][a-zA-Z_0-9\\-.]*\\s*==\\s*(([a-zA-Z_0-9\\-+\\*.,:=\\~\\/\\\\]+)|[\\"\']+([a-zA-Z_0-9\\-+\\*.,:=\\~\\/\\\\\\s]+)[\\"\']+)))', '(?P<KEY_VALUE_PAIR>([a-zA-Z_0-9][a-zA-Z_0-9\\-.]*\\s*=\\s*(([a-zA-Z_0-9\\-+\\*.,:=\\~\\/\\\\]+)|[\\"\']+([a-zA-Z_0-9\\-+\\*.,:=\\~\\/\\\\\\s]+)[\\"\']+)))', '(?P<COMPILER_AND_VERSION>(%\\s*([a-zA-Z_0-9][a-zA-Z_0-9\\-.]*)([\\s]*)@\\s*(((=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)\\s*:\\s*=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)(?!\\s*=)|:\\s*=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)(?!\\s*=)|=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)\\s*:|:)|=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b))(\\s*[,]\\s*((=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)\\s*:\\s*=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)(?!\\s*=)|:\\s*=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)(?!\\s*=)|=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)\\s*:|:)|=?([a-zA-Z0-9_][a-zA-Z_0-9\\-\\.]*\\b)))*)))', '(?P<COMPILER>(%\\s*([a-zA-Z_0-9][a-zA-Z_0-9\\-.]*)))', '(?P<FILENAME>((\\.|\\/|[a-zA-Z0-9-_]*\\/)([a-zA-Z0-9-_\\.\\/]*)(\\.json|\\.yaml)))', 
'(?P<FULLY_QUALIFIED_PACKAGE_NAME>((([a-zA-Z_0-9][a-zA-Z_0-9\\-]*)(\\.([a-zA-Z_0-9][a-zA-Z_0-9\\-]*))+)))', '(?P<UNQUALIFIED_PACKAGE_NAME>(([a-zA-Z_0-9][a-zA-Z_0-9\\-]*)))', '(?P<DAG_HASH>(/([a-zA-Z_0-9]+)))', '(?P<WS>(\\s+))']
List of all the regexes used to match spec parts, in order of precedence
- class spack.parser.Token(kind: TokenType, value: str, start: int | None = None, end: int | None = None)[source]
Bases:
object
Represents tokens; generated from input by lexer and fed to parse().
- end
- kind
- start
- value
- class spack.parser.TokenBase(value, names=None, *, module=None, qualname=None, type=None, start=1, boundary=None)[source]
Bases:
Enum
Base class for an enum type with a regex value
- class spack.parser.TokenContext(token_stream: Iterator[Token])[source]
Bases:
object
Token context passed around by parsers
- accept(kind: TokenType)[source]
If the next token is of the specified kind, advance the stream and return True. Otherwise return False.
- current_token
- next_token
- token_stream
- class spack.parser.TokenType(value, names=None, *, module=None, qualname=None, type=None, start=1, boundary=None)[source]
Bases:
TokenBase
Enumeration of the different token kinds in the spec grammar.
Order of declaration is extremely important, since text containing specs is parsed with a single regex obtained by
"|".join(...)
of all the regexes in the order of declaration.
- BOOL_VARIANT = 5
- COMPILER = 9
- COMPILER_AND_VERSION = 8
- DAG_HASH = 13
- DEPENDENCY = 1
- FILENAME = 10
- FULLY_QUALIFIED_PACKAGE_NAME = 11
- KEY_VALUE_PAIR = 7
- PROPAGATED_BOOL_VARIANT = 4
- PROPAGATED_KEY_VALUE_PAIR = 6
- UNQUALIFIED_PACKAGE_NAME = 12
- VERSION = 3
- VERSION_HASH_PAIR = 2
- WS = 14
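The order-of-declaration rule above can be illustrated with a small standalone lexer (a sketch, not Spack's actual implementation; the token kinds and patterns here are simplified stand-ins). Named groups are joined with "|", and the first alternative that matches wins, so more specific kinds must be declared before more general ones.

```python
import re

# Minimal token table in order of precedence. COMPILER_AND_VERSION must
# precede COMPILER, or "%gcc@12.1" would lex as a bare COMPILER token
# followed by a stray VERSION token.
TOKEN_REGEXES = [
    ("COMPILER_AND_VERSION", r"%\s*[\w][\w.-]*@[\w.:,-]+"),
    ("COMPILER", r"%\s*[\w][\w.-]*"),
    ("VERSION", r"@[\w.:,-]+"),
    ("PACKAGE_NAME", r"[\w][\w-]*"),
    ("WS", r"\s+"),
]

# One combined regex: each kind becomes a named group in declaration order.
LEXER = re.compile("|".join(f"(?P<{kind}>{rx})" for kind, rx in TOKEN_REGEXES))

def tokenize(text):
    """Return (kind, value) pairs; the first matching alternative wins."""
    return [(m.lastgroup, m.group()) for m in LEXER.finditer(text)]
```

Swapping the first two table entries would make `%gcc@12.1` lex incorrectly, which is why the real TOKEN_REGEXES list is described as order-sensitive.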
- spack.parser.parse(text: str) List[Spec] [source]
Parse text into a list of specs
- Parameters:
text (str) – text to be parsed
- Returns:
List of specs
spack.patch module
- class spack.patch.FilePatch(pkg, relative_path, level, working_dir, ordering_key=None)[source]
Bases:
Patch
Describes a patch that is retrieved from a file in the repository.
- Parameters:
- property sha256
- exception spack.patch.NoSuchPatchError(message, long_message=None)[source]
Bases:
SpackError
Raised when a patch file doesn’t exist.
- class spack.patch.Patch(pkg, path_or_url, level, working_dir)[source]
Bases:
object
Base class for patches.
- Parameters:
pkg (str) – the package that owns the patch
The owning package is not necessarily the package to apply the patch to – in the case where a dependent package patches its dependency, it is the dependent’s fullname.
- apply(stage)[source]
Apply a patch to source in a stage.
- Parameters:
stage (spack.stage.Stage) – stage where source code lives
- property stage
- class spack.patch.PatchCache(repository, data=None)[source]
Bases:
object
Index of patches used in a repository, by sha256 hash.
This allows us to look up patches without loading all packages. It’s also needed to properly implement dependency patching, as we need a way to look up patches that come from packages not in the Spec sub-DAG.
The patch index is structured like this in a file (this is YAML, but we write JSON):
patches:
  sha256:
    namespace1.package1:
      <patch json>
    namespace2.package2:
      <patch json>
  ... etc. ...
- patch_for_package(sha256, pkg)[source]
Look up a patch in the index and build a patch object for it.
- Parameters:
sha256 (str) – sha256 hash to look up
pkg (spack.package_base.PackageBase) – Package object to get patch for.
We build patch objects lazily because building them requires that we have information about the package’s location in its repo.
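The lookup the index enables can be sketched with a hypothetical miniature of the structure above (the sha256 value, package name, and metadata fields are invented for illustration):

```python
# Hypothetical miniature of the patch index described above: hash -> owning
# package fullname -> patch metadata. No package modules need to be loaded.
index = {
    "patches": {
        "example-sha256": {
            "builtin.libelf": {"relative_path": "fix-configure.patch", "level": 1},
        }
    }
}

def patch_for_package(index, sha256, owner):
    """Look up patch metadata by sha256 hash and owner fullname."""
    return index["patches"][sha256][owner]
```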
- exception spack.patch.PatchDirectiveError(message, long_message=None)[source]
Bases:
SpackError
Raised when the wrong arguments are supplied to the patch directive.
- class spack.patch.UrlPatch(pkg, url, level=1, working_dir='.', ordering_key=None, **kwargs)[source]
Bases:
Patch
Describes a patch that is retrieved from a URL.
- Parameters:
- fetch()[source]
Retrieve the patch in a temporary stage and compute self.path
- Parameters:
stage – stage for the package that needs to be patched
- property stage
- spack.patch.apply_patch(stage, patch_path, level=1, working_dir='.')[source]
Apply the patch at patch_path to code in the stage.
- Parameters:
stage (spack.stage.Stage) – stage with code that will be patched
patch_path (str) – filesystem location for the patch to apply
level (int or None) – patch level (default 1)
working_dir (str) – relative path within the stage to change to (default ‘.’)
spack.paths module
Defines paths that are part of Spack’s directory structure.
Do not import other spack
modules here. This module is used
throughout Spack and should bring in a minimal number of external
dependencies.
- spack.paths.bin_path = '/home/docs/checkouts/readthedocs.org/user_builds/spack/checkouts/v0.20.3/lib/spack/docs/_spack_root/bin'
bin directory in the spack prefix
- spack.paths.default_misc_cache_path = '/home/docs/.spack/cache'
transient caches for Spack data (virtual cache, patch sha256 lookup, etc.)
- spack.paths.default_monitor_path = '/home/docs/.spack/reports/monitor'
spack monitor analysis directories
- spack.paths.default_test_path = '/home/docs/.spack/test'
installation test (spack test) output
- spack.paths.default_user_bootstrap_path = '/home/docs/.spack/bootstrap'
bootstrap store for bootstrapping clingo and other tools
- spack.paths.prefix = '/home/docs/checkouts/readthedocs.org/user_builds/spack/checkouts/v0.20.3/lib/spack/docs/_spack_root'
This file lives in $prefix/lib/spack/spack/__file__
- spack.paths.reports_path = '/home/docs/.spack/reports'
junit, cdash, etc. reports about builds
- spack.paths.sbang_script = '/home/docs/checkouts/readthedocs.org/user_builds/spack/checkouts/v0.20.3/lib/spack/docs/_spack_root/bin/sbang'
The sbang script in the spack installation
- spack.paths.spack_root = '/home/docs/checkouts/readthedocs.org/user_builds/spack/checkouts/v0.20.3/lib/spack/docs/_spack_root'
synonym for prefix
- spack.paths.spack_script = '/home/docs/checkouts/readthedocs.org/user_builds/spack/checkouts/v0.20.3/lib/spack/docs/_spack_root/bin/spack'
The spack script itself
- spack.paths.system_config_path = '/etc/spack'
System configuration location
- spack.paths.user_config_path = '/home/docs/.spack'
User configuration location
- spack.paths.user_repos_cache_path = '/home/docs/.spack/git_repos'
git repositories fetched to compare commits to versions
spack.projections module
spack.provider_index module
Classes and functions to manage providers of virtual dependencies
- class spack.provider_index.ProviderIndex(repository: Repo | RepoPath, specs: List[Spec] | None = None, restrict: bool = False)[source]
Bases:
_IndexBase
- static from_json(stream, repository)[source]
Construct a provider index from its JSON representation.
- Parameters:
stream – stream where to read from the JSON data
- merge(other)[source]
Merge another provider index into this one.
- Parameters:
other (ProviderIndex) – provider index to be merged
- exception spack.provider_index.ProviderIndexError(message, long_message=None)[source]
Bases:
SpackError
Raised when there is a problem with a ProviderIndex.
spack.relocate module
- exception spack.relocate.InstallRootStringError(file_path, root_path)[source]
Bases:
SpackError
- spack.relocate.ensure_binaries_are_relocatable(binaries)[source]
Raise an error if any binary in the list is not relocatable.
- Parameters:
binaries (list) – list of binaries to check
- Raises:
InstallRootStringError – if the file is not relocatable
- spack.relocate.ensure_binary_is_relocatable(filename, paths_to_relocate=None)[source]
Raises if any given or default absolute path is found in the binary (apart from rpaths / load commands).
- Parameters:
filename – absolute path of the file to be analyzed
- Raises:
InstallRootStringError – if the binary contains an absolute path
ValueError – if the filename does not exist or the path is not absolute
- spack.relocate.fixup_macos_rpath(root, filename)[source]
Apply rpath fixups to the given file.
- Parameters:
root – absolute path to the parent directory
filename – relative path to the library or binary
- Returns:
True if fixups were applied, else False
- spack.relocate.fixup_macos_rpaths(spec)[source]
Remove duplicate and nonexistent rpaths.
Some autotools packages write their own
-rpath
entries in addition to those implicitly added by the Spack compiler wrappers. On Linux these duplicate rpaths are eliminated, but on macOS they result in multiple entries, which makes it harder to adjust with
install_name_tool -delete_rpath
.
- spack.relocate.is_binary(filename)[source]
Returns True if a file is binary, False otherwise
- Parameters:
filename – file to be tested
- Returns:
True or False
- spack.relocate.is_relocatable(spec)[source]
Returns True if an installed spec is relocatable.
- Parameters:
spec (spack.spec.Spec) – spec to be analyzed
- Returns:
True if the binaries of an installed spec are relocatable and False otherwise.
- Raises:
ValueError – if the spec is not installed
- spack.relocate.macho_find_paths(orig_rpaths, deps, idpath, old_layout_root, prefix_to_prefix)[source]
- Inputs:
original rpaths from Mach-O binaries; dependency libraries for Mach-O binaries; id path of Mach-O libraries; old install directory layout root; prefix_to_prefix dictionary which maps prefixes in the old directory layout to directories in the new directory layout
- Output:
paths_to_paths dictionary which maps all of the old paths to new paths
- spack.relocate.macho_make_paths_normal(orig_path_name, rpaths, deps, idpath)[source]
Return a dictionary mapping the relativized rpaths to the original rpaths. This dictionary is used to replace paths in Mach-O binaries. Replace ‘@loader_path’ with the dirname of the original path name in rpaths and deps; idpath is replaced with the original path name.
- spack.relocate.macho_make_paths_relative(path_name, old_layout_root, rpaths, deps, idpath)[source]
Return a dictionary mapping the original rpaths to the relativized rpaths. This dictionary is used to replace paths in mach-o binaries. Replace old_dir with relative path from dirname of path name in rpaths and deps; idpath is replaced with @rpath/libname.
- spack.relocate.macholib_get_paths(cur_path)[source]
Get rpaths, dependent libraries, and library id of mach-o objects.
- spack.relocate.make_elf_binaries_relative(new_binaries, orig_binaries, orig_layout_root)[source]
Replace the original RPATHs in the new binaries making them relative to the original layout root.
- spack.relocate.make_link_relative(new_links, orig_links)[source]
Compute the relative target from the original link and make the new link relative.
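The relative-target computation can be sketched with the standard library (a sketch assuming plain os.path.relpath semantics, not Spack's exact code):

```python
import os

def relative_link_target(link_path, target_path):
    """Compute the target a symlink at link_path should store so that it
    still resolves to target_path if the whole prefix is moved."""
    # The target is interpreted relative to the directory containing the link.
    return os.path.relpath(target_path, os.path.dirname(link_path))
```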
- spack.relocate.make_macho_binaries_relative(cur_path_names, orig_path_names, old_layout_root)[source]
Replace old RPATHs with paths relative to old_dir in binary files
- spack.relocate.modify_macho_object(cur_path, rpaths, deps, idpath, paths_to_paths)[source]
This function is used to make Mach-O buildcaches on macOS by replacing old paths with new paths using install_name_tool.
- Inputs:
Mach-O binary to be modified; original rpaths; original dependency paths; original id path if a Mach-O library; dictionary mapping paths in the old install layout to the new install layout
- spack.relocate.modify_object_macholib(cur_path, paths_to_paths)[source]
This function is used when installing Mach-O buildcaches on Linux, by rewriting Mach-O loader commands for dependency library paths of Mach-O binaries and the id path for Mach-O libraries. Rewriting of rpaths is handled by replace_prefix_bin.
- Inputs:
Mach-O binary to be modified; dictionary mapping paths in the old install layout to the new install layout
- spack.relocate.needs_binary_relocation(m_type, m_subtype)[source]
Returns True if the file with MIME type/subtype passed as arguments needs binary relocation, False otherwise.
- spack.relocate.needs_text_relocation(m_type, m_subtype)[source]
Returns True if the file with MIME type/subtype passed as arguments needs text relocation, False otherwise.
- spack.relocate.new_relocate_elf_binaries(binaries, prefix_to_prefix)[source]
Take a list of binaries, and an ordered dictionary of prefix to prefix mapping, and update the rpaths accordingly.
- spack.relocate.relocate_elf_binaries(binaries, orig_root, new_root, new_prefixes, rel, orig_prefix, new_prefix)[source]
Relocate the binaries passed as arguments by changing their RPATHs.
Use patchelf to get the original RPATHs and then replace them with rpaths in the new directory layout.
New RPATHs are determined from a dictionary mapping the prefixes in the old directory layout to the prefixes in the new directory layout if the rpath was in the old layout root, i.e. system paths are not replaced.
- Parameters:
binaries (list) – list of binaries that might need relocation, located in the new prefix
orig_root (str) – original root to be substituted
new_root (str) – new root to be used, only relevant for relative RPATHs
new_prefixes (dict) – dictionary that maps the original prefixes to where they should be relocated
rel (bool) – True if the RPATHs are relative, False if they are absolute
orig_prefix (str) – prefix where the executable was originally located
new_prefix (str) – prefix where we want to relocate the executable
- spack.relocate.relocate_links(links, prefix_to_prefix)[source]
Relocate links to a new install prefix.
- spack.relocate.relocate_macho_binaries(path_names, old_layout_root, new_layout_root, prefix_to_prefix, rel, old_prefix, new_prefix)[source]
Use the macholib Python package to get the rpaths, dependent libraries, and library identity for libraries from the Mach-O object. Modify them with the replacement paths queried from the dictionary mapping old layout prefixes to hashes and the dictionary mapping hashes to the new layout prefixes.
- spack.relocate.relocate_text(files, prefixes)[source]
Relocate text file from the original installation prefix to the new prefix.
Relocation also affects the path in Spack’s sbang script.
- Parameters:
files (list) – Text files to be relocated
prefixes (OrderedDict) – String prefixes which need to be changed
- spack.relocate.relocate_text_bin(binaries, prefixes)[source]
Replace null terminated path strings hard-coded into binaries.
The new install prefix must be shorter than the original one.
- Parameters:
binaries (list) – binaries to be relocated
prefixes (OrderedDict) – String prefixes which need to be changed.
- Raises:
spack.relocate_text.BinaryTextReplaceError – when the new path is longer than the old path
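The shorter-prefix requirement exists because a binary's byte layout cannot change. A simplified sketch of the technique (not Spack's implementation, which raises CannotGrowString rather than ValueError): rewrite each null-terminated string in place and pad the tail with null bytes so the file size and every offset are preserved.

```python
def replace_prefix_cstring(data: bytes, old: bytes, new: bytes) -> bytes:
    """Replace `old` with `new` inside null-terminated strings, keeping the
    total size and all byte offsets unchanged by padding with nulls."""
    if len(new) > len(old):
        # Spack raises CannotGrowString here; a plain ValueError for the sketch.
        raise ValueError("new prefix must not be longer than the old one")
    out = bytearray(data)
    pos = 0
    while (i := out.find(old, pos)) != -1:
        # Rewrite the C string starting at the match, up to its terminator.
        end = out.find(b"\x00", i)
        end = len(out) if end == -1 else end
        replaced = bytes(out[i:end]).replace(old, new, 1)
        out[i:end] = replaced.ljust(end - i, b"\x00")  # same length as before
        pos = i + 1
    return bytes(out)
```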
spack.relocate_text module
This module contains pure-Python classes and functions for replacing paths inside text files and binaries.
- class spack.relocate_text.BinaryFilePrefixReplacer(prefix_to_prefix, suffix_safety_size=7)[source]
Bases:
PrefixReplacer
- classmethod binary_text_regex(binary_prefixes, suffix_safety_size=7)[source]
Create a regex that looks for exact matches of prefixes, and also tries to match a C-string type null terminator in a small lookahead window.
- Parameters:
- Returns:
compiled regex
- classmethod from_strings_or_bytes(prefix_to_prefix: Dict[str | bytes, str | bytes], suffix_safety_size: int = 7) BinaryFilePrefixReplacer [source]
Create a BinaryFilePrefixReplacer from an ordered prefix to prefix map.
- Parameters:
prefix_to_prefix (OrderedDict) – Ordered mapping of prefix to prefix.
suffix_safety_size (int) – Number of bytes to retain at the end of a C-string to avoid binary string-aliasing issues.
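A hedged sketch of how binary_text_regex could build such a pattern (the actual classmethod may differ): escape each prefix, join the prefixes with alternation, and capture up to suffix_safety_size trailing non-null bytes plus an optional C-string terminator.

```python
import re

def binary_text_regex(binary_prefixes, suffix_safety_size=7):
    """Build a bytes regex matching any of the given prefixes plus a short
    suffix window ending at (or before) a null terminator. A sketch only."""
    alternation = b"|".join(re.escape(p) for p in binary_prefixes)
    # Lookahead window: up to suffix_safety_size non-null bytes, then an
    # optional null byte marking the end of the C string.
    suffix = b"([^\x00]{0,%d}\x00?)" % suffix_safety_size
    return re.compile(b"(" + alternation + b")" + suffix)
```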
- exception spack.relocate_text.BinaryStringReplacementError(file_path, old_len, new_len)[source]
Bases:
SpackError
- exception spack.relocate_text.BinaryTextReplaceError(msg)[source]
Bases:
SpackError
- exception spack.relocate_text.CannotGrowString(old, new)[source]
Bases:
BinaryTextReplaceError
- exception spack.relocate_text.CannotShrinkCString(old, new, full_old_string)[source]
Bases:
BinaryTextReplaceError
- class spack.relocate_text.PrefixReplacer(prefix_to_prefix: Dict[bytes, bytes])[source]
Bases:
object
Base class for applying a prefix to prefix map to a list of binaries or text files. Child classes implement _apply_to_file to do the actual work, which is different when it comes to binaries and text files.
- class spack.relocate_text.TextFilePrefixReplacer(prefix_to_prefix: Dict[bytes, bytes])[source]
Bases:
PrefixReplacer
This class applies prefix to prefix mappings for relocation on text files.
Note that UTF-8 encoding is assumed.
- spack.relocate_text.filter_identity_mappings(prefix_to_prefix)[source]
Drop mappings that are not changed.
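The described behavior can be sketched as a pure function (a guess at the exact semantics, preserving insertion order):

```python
def filter_identity_mappings(prefix_to_prefix):
    """Drop mappings whose old and new prefixes are identical, since
    rewriting them would be a no-op."""
    return {old: new for old, new in prefix_to_prefix.items() if old != new}
```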
spack.repo module
- exception spack.repo.BadRepoError(message, long_message=None)[source]
Bases:
RepoError
Raised when repo layout is invalid.
- exception spack.repo.FailedConstructorError(name, exc_type, exc_obj, exc_tb)[source]
Bases:
RepoError
Raised when a package’s class constructor fails.
- class spack.repo.FastPackageChecker(packages_path)[source]
Bases:
Mapping
Cache that maps package names to the stats obtained on the ‘package.py’ files associated with them.
For each repository a cache is maintained at class level, and shared among all instances referring to it. Update of the global cache is done lazily during instance initialization.
- exception spack.repo.IndexError(message, long_message=None)[source]
Bases:
RepoError
Raised when there’s an error with an index.
- class spack.repo.Indexer(repository)[source]
Bases:
object
Adaptor for indexes that need to be generated when repos are updated.
- needs_update(pkg)[source]
Whether an update is needed when the package file hasn’t changed.
- Returns:
True if this package needs its index updated, False otherwise.
- Return type:
(bool)
We already automatically update indexes when package files change, but other files (like patches) may change underneath the package file. This method can be used to check additional package-specific files whenever they’re loaded, to tell the RepoIndex to update the index just for that package.
- exception spack.repo.InvalidNamespaceError(message, long_message=None)[source]
Bases:
RepoError
Raised when an invalid namespace is encountered.
- class spack.repo.MockRepositoryBuilder(root_directory, namespace=None)[source]
Bases:
object
Build a mock repository in a directory
- spack.repo.NOT_PROVIDED = <object object>
Guaranteed unused default value for some functions.
- exception spack.repo.NoRepoConfiguredError(message, long_message=None)[source]
Bases:
RepoError
Raised when there are no repositories configured.
- class spack.repo.PatchIndexer(repository)[source]
Bases:
Indexer
Lifecycle methods for patch cache.
- needs_update()[source]
Whether an update is needed when the package file hasn’t changed.
- Returns:
True if this package needs its index updated, False otherwise.
- Return type:
(bool)
We already automatically update indexes when package files change, but other files (like patches) may change underneath the package file. This method can be used to check additional package-specific files whenever they’re loaded, to tell the RepoIndex to update the index just for that package.
- class spack.repo.ProviderIndexer(repository)[source]
Bases:
Indexer
Lifecycle methods for virtual package providers.
- spack.repo.ROOT_PYTHON_NAMESPACE = 'spack.pkg'
Package modules are imported as spack.pkg.<repo-namespace>.<pkg-name>
- class spack.repo.Repo(root, cache=None)[source]
Bases:
object
Class representing a package repository in the filesystem.
Each package repository must have a top-level configuration file called repo.yaml.
Currently, repo.yaml must define:
- namespace:
A Python namespace where the repository’s packages should live.
- all_package_classes()[source]
Iterator over all package classes in the repository.
Use this with care, because loading packages is slow.
- all_package_names(include_virtuals=False)[source]
Returns a sorted list of all package names in the Repo.
- dirname_for_package_name(pkg_name)[source]
Get the directory name for a particular package. This is the directory that contains its package.py file.
- dump_provenance(spec, path)[source]
Dump provenance information for a spec to a particular path.
This dumps the package file and any associated patch files. Raises UnknownPackageError if not found.
- filename_for_package_name(pkg_name)[source]
Get the filename for the module we should load for a particular package. Packages for a Repo live in
$root/<package_name>/package.py
This will return a proper package.py path even if the package doesn’t exist yet, so callers will need to ensure the package exists before importing.
- get_pkg_class(pkg_name)[source]
Get the class for the package out of its module.
First loads (or fetches from cache) a module for the package. Then extracts the package class from the module according to Spack’s naming convention.
- property index
Construct the index for this repo lazily.
- is_virtual(pkg_name)[source]
Return True if the package with this name is virtual, False otherwise.
This function uses the provider index. If calling from a code block that is used to construct the provider index, use the
is_virtual_safe
function instead.
- Parameters:
pkg_name (str) – name of the package we want to check
- is_virtual_safe(pkg_name)[source]
Return True if the package with this name is virtual, False otherwise.
This function doesn’t use the provider index.
- Parameters:
pkg_name (str) – name of the package we want to check
- property patch_index
Index of patches and packages they’re defined on.
- property provider_index
A provider index with names specific to this repo.
- real_name(import_name)[source]
Allow users to import Spack packages using Python identifiers.
A python identifier might map to many different Spack package names due to hyphen/underscore ambiguity.
- Easy example:
num3proxy -> 3proxy
- Ambiguous:
foo_bar -> foo_bar, foo-bar
- More ambiguous:
foo_bar_baz -> foo_bar_baz, foo-bar-baz, foo_bar-baz, foo-bar_baz
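The ambiguity can be made concrete: each underscore in the import name may stand for either `_` or `-` in the package name, so an identifier with n underscores has 2**n candidates. A sketch of the enumeration (not Spack's actual resolution code, which checks each candidate against the repository; the digit-prefix case like num3proxy is not covered here):

```python
import itertools

def candidate_package_names(import_name):
    """Enumerate the Spack package names an underscored identifier may map to."""
    parts = import_name.split("_")
    return [
        # Re-join the parts with every combination of "_" and "-" separators.
        "".join(p + s for p, s in zip(parts, seps + ("",)))
        for seps in itertools.product(("_", "-"), repeat=len(parts) - 1)
    ]
```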
- property tag_index
Index of tags and which packages they’re defined on.
- exception spack.repo.RepoError(message, long_message=None)[source]
Bases:
SpackError
Superclass for repository-related errors.
- class spack.repo.RepoIndex(package_checker: FastPackageChecker, namespace: str, cache: FileCache)[source]
Bases:
object
Container class that manages a set of Indexers for a Repo.
This class is responsible for checking packages in a repository for updates (using
FastPackageChecker
) and for regenerating indexes when they’re needed.
Indexers should be added to the RepoIndex using
add_indexer(name, indexer)
, and they should support the interface defined by
Indexer
, so that the RepoIndex can read, generate, and update stored indices.
Generated indexes are accessed by name via
__getitem__()
.
- class spack.repo.RepoLoader(fullname, repo, package_name)[source]
Bases:
_PrependFileLoader
Loads a Python module associated with a package in specific repository
- class spack.repo.RepoPath(*repos, **kwargs)[source]
Bases:
object
A RepoPath is a list of repos that function as one.
It functions exactly like a Repo, but it operates on the combined results of the Repos in its list instead of on a single package repository.
- Parameters:
repos (list) – list Repo objects or paths to put in this RepoPath
- dump_provenance(spec, path)[source]
Dump provenance information for a spec to a particular path.
This dumps the package file and any associated patch files. Raises UnknownPackageError if not found.
- exists(pkg_name)[source]
Whether a package with the given name exists in the path’s repos.
Note that virtual packages do not “exist”.
- get_repo(namespace, default=<object object>)[source]
Get a repository by namespace.
- Parameters:
namespace – Look up this namespace in the RepoPath, and return it if found.
Optional Arguments:
default:
If default is provided, return it when the namespace isn’t found. If not, raise an UnknownNamespaceError.
- is_virtual(pkg_name)[source]
Return True if the package with this name is virtual, False otherwise.
This function uses the provider index. If calling from a code block that is used to construct the provider index, use the
is_virtual_safe
function instead.
- Parameters:
pkg_name (str) – name of the package we want to check
- is_virtual_safe(pkg_name)[source]
Return True if the package with this name is virtual, False otherwise.
This function doesn’t use the provider index.
- Parameters:
pkg_name (str) – name of the package we want to check
- property patch_index
Merged PatchIndex from all Repos in the RepoPath.
- property provider_index
Merged ProviderIndex from all Repos in the RepoPath.
- property tag_index
Merged TagIndex from all Repos in the RepoPath.
- class spack.repo.ReposFinder[source]
Bases:
object
MetaPathFinder class that loads a Python module corresponding to a Spack package.
Returns a loader based on inspection of the current global repository list.
- class spack.repo.TagIndexer(repository)[source]
Bases:
Indexer
Lifecycle methods for a TagIndex on a Repo.
- exception spack.repo.UnknownEntityError(message, long_message=None)[source]
Bases:
RepoError
Raised when we encounter a package spack doesn’t have.
- exception spack.repo.UnknownNamespaceError(namespace, name=None)[source]
Bases:
UnknownEntityError
Raised when we encounter an unknown namespace
- exception spack.repo.UnknownPackageError(name, repo=None)[source]
Bases:
UnknownEntityError
Raised when we encounter a package spack doesn’t have.
- spack.repo.all_package_names(include_virtuals=False)[source]
Convenience wrapper around
spack.repo.all_package_names()
.
- spack.repo.autospec(function)[source]
Decorator that automatically converts the first argument of a function to a Spec.
- spack.repo.create(configuration)[source]
Create a RepoPath from a configuration object.
- Parameters:
configuration (spack.config.Configuration) – configuration object
- spack.repo.create_or_construct(path, namespace=None)[source]
Create a repository, or just return a Repo if it already exists.
- spack.repo.create_repo(root, namespace=None, subdir='packages')[source]
Create a new repository in root with the specified namespace.
If the namespace is not provided, use basename of root. Return the canonicalized path and namespace of the created repository.
- spack.repo.diff_packages(rev1, rev2)[source]
Compute packages lists for the two revisions and return a tuple containing all the packages in rev1 but not in rev2 and all the packages in rev2 but not in rev1.
- spack.repo.get_all_package_diffs(type, rev1='HEAD^1', rev2='HEAD')[source]
- Show packages changed, added, or removed (or any combination of those)
since a commit.
- spack.repo.is_package_file(filename)[source]
Determine whether we are in a package file from a repo.
- spack.repo.namespace_from_fullname(fullname)[source]
Return the repository namespace only for the full module name.
For instance:
namespace_from_fullname(‘spack.pkg.builtin.hdf5’) == ‘builtin’
- Parameters:
fullname (str) – full name for the Python module
- spack.repo.python_package_for_repo(namespace)[source]
Returns the full namespace of a repository, given its relative one
For instance:
python_package_for_repo(‘builtin’) == ‘spack.pkg.builtin’
- Parameters:
namespace (str) – repo namespace
- spack.repo.use_repositories(*paths_and_repos, **kwargs)[source]
Use the repositories passed as arguments within the context manager.
- Parameters:
*paths_and_repos – paths to the repositories to be used, or already constructed Repo objects
override (bool) – if True use only the repositories passed as input, if False add them to the top of the list of current repositories.
- Returns:
Corresponding RepoPath object
spack.report module
Tools to produce reports of spec installations
- class spack.report.BuildInfoCollector(specs: List[Spec])[source]
Bases:
InfoCollector
Collect information for the PackageInstaller._install_task method.
- Parameters:
specs – specs whose install information will be recorded
- extract_package_from_signature(instance, *args, **kwargs)[source]
Return the package instance, given the signature of the wrapped function.
- fetch_log(pkg)[source]
Return the stdout log associated with the function being monitored
- Parameters:
pkg – package under consideration
- class spack.report.InfoCollector(wrap_class: Type, do_fn: str, specs: List[Spec])[source]
Bases:
object
Base class for context manager objects that collect information during the execution of certain package functions.
The data collected is available through the
specs
attribute once exited, and it’s organized as a list where each item represents the installation of one spec.
- extract_package_from_signature(instance, *args, **kwargs)[source]
Return the package instance, given the signature of the wrapped function.
- fetch_log(pkg: PackageBase) str [source]
Return the stdout log associated with the function being monitored
- Parameters:
pkg – package under consideration
- init_spec_record(input_spec: Spec, record)[source]
Add additional entries to a spec record when entering the collection context.
- on_success(pkg: PackageBase, kwargs, package_record)[source]
Add additional properties on function call success.
- class spack.report.TestInfoCollector(specs: List[Spec], record_directory: str)[source]
Bases:
InfoCollector
Collect information for the PackageBase.do_test method.
- Parameters:
specs – specs whose install information will be recorded
record_directory – record directory for test log paths
- extract_package_from_signature(instance, *args, **kwargs)[source]
Return the package instance, given the signature of the wrapped function.
- fetch_log(pkg: PackageBase)[source]
Return the stdout log associated with the function being monitored
- Parameters:
pkg – package under consideration
- spack.report.build_context_manager(reporter: Reporter, filename: str, specs: List[Spec])[source]
Decorate a package to generate a report after the installation function is executed.
- Parameters:
reporter – object that generates the report
filename – filename for the report
specs – specs that need reporting
- spack.report.test_context_manager(reporter: Reporter, filename: str, specs: List[Spec], raw_logs_dir: str)[source]
Decorate a package to generate a report after the test function is executed.
- Parameters:
reporter – object that generates the report
filename – filename for the report
specs – specs that need reporting
raw_logs_dir – record directory for test log paths
spack.resource module
Describes an optional resource needed for a build.
Typically a bunch of sources that can be built in-tree within another package to enable optional features.
spack.rewiring module
- exception spack.rewiring.PackageNotInstalledError(spliced_spec, build_spec, dep)[source]
Bases:
RewireError
Raised when the build_spec for a splice was not installed.
- exception spack.rewiring.RewireError(message, long_msg=None)[source]
Bases:
SpackError
Raised when something goes wrong with rewiring.
spack.s3_handler module
- class spack.s3_handler.UrllibS3Handler[source]
Bases:
BaseHandler
- class spack.s3_handler.WrapStream(raw)[source]
Bases:
BufferedReader
- detach()[source]
Disconnect this buffer from its underlying raw stream and return it.
After the raw stream has been detached, the buffer is in an unusable state.
- read(*args, **kwargs)[source]
Read and return up to n bytes.
If the argument is omitted, None, or negative, reads and returns all data until EOF.
If the argument is positive, and the underlying raw stream is not ‘interactive’, multiple raw reads may be issued to satisfy the byte count (unless EOF is reached first). But for interactive raw streams (as well as sockets and pipes), at most one raw read will be issued, and a short result does not imply that EOF is imminent.
Returns an empty bytes object on EOF.
Returns None if the underlying raw stream was open in non-blocking mode and no data is available at the moment.
spack.spec module
Spack allows very fine-grained control over how packages are installed and over how they are built and configured. To make this easy, it has its own syntax for declaring a dependence. We call a descriptor of a particular package configuration a “spec”.
The syntax looks like this:
$ spack install mpileaks ^openmpi @1.2:1.4 +debug %intel @12.1 target=zen
                0        1        2        3      4      5    6
The first part of this is the command, ‘spack install’. The rest of the line is a spec for a particular installation of the mpileaks package.
The package to install
A dependency of the package, prefixed by ^
A version descriptor for the package. This can either be a specific version, like "1.2", or it can be a range of versions, e.g. "1.2:1.4". If multiple specific versions or multiple ranges are acceptable, they can be separated by commas, e.g. if a package will only build with versions 1.0, 1.2-1.4, and 1.6-1.8 of mvapich, you could say:
depends_on("mvapich@1.0,1.2:1.4,1.6:1.8")
A compile-time variant of the package. If you need openmpi to be built in debug mode for your package to work, you can require it by adding +debug to the openmpi spec when you depend on it. If you do NOT want the debug option to be enabled, then replace this with -debug. If you would like for the variant to be propagated through all your package's dependencies use "++" for enabling and "--" or "~~" for disabling.
The name of the compiler to build with.
The versions of the compiler to build with. Note that the identifier for a compiler version is the same '@' that is used for a package version. A version list denoted by '@' is associated with the compiler only if it comes immediately after the compiler name. Otherwise it will be associated with the current package spec.
The architecture to build with. This is needed on machines where cross-compilation is required
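As an illustrative sketch (not Spack's actual parser), the sigil rules above can be mimicked by classifying whitespace-separated tokens of the example spec:

```python
# Toy classifier for the numbered parts of the example spec above.
# Spack's real parser is token-based and far richer; this only maps
# each leading sigil to the role described in the text.
SIGILS = {"^": "dependency", "@": "version", "+": "variant on",
          "~": "variant off", "%": "compiler"}

def classify(token: str) -> str:
    for sigil, kind in SIGILS.items():
        if token.startswith(sigil):
            return kind
    if "=" in token:          # e.g. target=zen
        return "key=value"
    return "package"          # bare name, e.g. mpileaks

tokens = "mpileaks ^openmpi @1.2:1.4 +debug %intel @12.1 target=zen".split()
kinds = [classify(t) for t in tokens]
```

Running this labels the seven tokens exactly as the numbered positions above describe them.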
- exception spack.spec.ArchitecturePropagationError(message, long_message=None)[source]
Bases:
SpecError
Raised when the double equal symbols are used to assign the spec’s architecture.
- class spack.spec.CompilerSpec(*args)[source]
Bases:
object
The CompilerSpec field represents the compiler or range of compiler versions that a package should be built with. CompilerSpecs have a name and a version list.
- property concrete
A CompilerSpec is concrete if its versions are concrete and there is an available compiler with the right version.
- constrain(other: CompilerSpec) bool [source]
Intersect self’s versions with other.
Return whether the CompilerSpec changed.
- property display_str
Equivalent to {compiler.name}{@compiler.version} for Specs, without extra @= for readability.
- intersects(other: CompilerSpec) bool [source]
Return True if there exists at least one concrete spec that matches both self and other, otherwise False.
For compiler specs this means that the name of the compiler must be the same for self and other, and that the version ranges should intersect.
- Parameters:
other – spec to be satisfied
- name
- satisfies(other: CompilerSpec) bool [source]
Return True if all concrete specs matching self also match other, otherwise False.
For compiler specs this means that the name of the compiler must be the same for self and other, and that the version range of self is a subset of that of other.
- Parameters:
other – spec to be satisfied
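The difference between intersects() and satisfies() can be illustrated with plain numeric version ranges (a simplification; Spack's real VersionList handles open ranges and multiple intervals):

```python
# Toy version ranges as inclusive (lo, hi) tuples.
def intersects(a, b):
    # True if at least one version lies in both ranges.
    return max(a[0], b[0]) <= min(a[1], b[1])

def satisfies(a, b):
    # True if range a is a subset of range b.
    return b[0] <= a[0] and a[1] <= b[1]

gcc_any = (9.0, 12.0)     # e.g. gcc@9:12
gcc_narrow = (10.0, 11.0) # e.g. gcc@10:11
```

Here the narrow range satisfies the wide one, both intersect, but the wide range does not satisfy the narrow one, matching the subset-vs-overlap distinction in the docstrings above.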
- property version
- versions
- exception spack.spec.DuplicateArchitectureError(message, long_message=None)[source]
Bases:
SpecError
Raised when the same architecture occurs in a spec twice.
- exception spack.spec.DuplicateCompilerSpecError(message, long_message=None)[source]
Bases:
SpecError
Raised when the same compiler occurs in a spec twice.
- exception spack.spec.DuplicateDependencyError(message, long_message=None)[source]
Bases:
SpecError
Raised when the same dependency occurs in a spec twice.
- exception spack.spec.InconsistentSpecError(message, long_message=None)[source]
Bases:
SpecError
Raised when two nodes in the same spec DAG have inconsistent constraints.
- exception spack.spec.InvalidDependencyError(pkg, deps)[source]
Bases:
SpecError
Raised when a dependency in a spec is not actually a dependency of the package.
- exception spack.spec.MultipleProviderError(vpkg, providers)[source]
Bases:
SpecError
Raised when multiple packages provide the same virtual dependency.
- exception spack.spec.NoProviderError(vpkg)[source]
Bases:
SpecError
Raised when there is no package that provides a particular virtual dependency.
- class spack.spec.Spec(spec_like=None, normal=False, concrete=False, external_path=None, external_modules=None)[source]
Bases:
object
- abstract_hash = None
- add_dependency_edge(dependency_spec: Spec, *, deptypes: str | List[str] | Tuple[str, ...])[source]
Add a dependency edge to this spec.
- Parameters:
dependency_spec – spec of the dependency
deptypes – dependency types for this edge
- property anonymous
- property build_spec
- clear_cached_hashes(ignore=())[source]
Clears all cached hashes in a Spec, while preserving other properties.
- property concrete
A spec is concrete if it describes a single build of a package.
More formally, a spec is concrete if concretize() has been called on it and it has been marked _concrete.
Concrete specs either can be or have been built. All constraints have been resolved, optional dependencies have been added or removed, a compiler has been chosen, and all variants have values.
- concretized(tests=False)[source]
This is a non-destructive version of concretize().
First clones, then returns a concrete version of this package without modifying this package.
- constrain(other, deps=True)[source]
Intersect self with other in-place. Return True if self changed, False otherwise.
- Parameters:
other – constraint to be added to self
deps – if False, constrain only the root node, otherwise constrain dependencies as well.
- Raises:
spack.error.UnsatisfiableSpecError – when self cannot be constrained
- copy(deps=True, **kwargs)[source]
Make a copy of this spec.
- Parameters:
- Returns:
A copy of this spec.
Examples
Deep copy with dependencies:
spec.copy()
spec.copy(deps=True)
Shallow copy (no dependencies):
spec.copy(deps=False)
Only build and run dependencies:
spec.copy(deps=('build', 'run'))
- property cshort_spec
Returns an auto-colorized version of self.short_spec.
- dag_hash(length=None)[source]
This is Spack’s default hash, used to identify installations.
Same as the full hash (includes package hash and build/link/run deps). Tells us when package files and any dependencies have changes.
NOTE: Versions of Spack prior to 0.18 only included link and run deps.
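A rough sketch of the idea behind dag_hash(): a content hash over a canonicalized node dictionary, optionally truncated to a prefix (illustrative only; Spack hashes its real node dicts and encodes the digest differently, e.g. in base32):

```python
import hashlib
import json

# Toy content hash over a node dict. sort_keys canonicalizes the
# JSON so the same content always yields the same digest.
def dag_hash(node, length=None):
    blob = json.dumps(node, sort_keys=True).encode()
    digest = hashlib.sha256(blob).hexdigest()
    return digest[:length] if length else digest

h = dag_hash({"name": "zlib", "version": "1.2.13"}, length=7)
```

Because the input is canonicalized, the hash is deterministic: any change to the package content or its dependencies would change the node dict and therefore the hash, which is what lets the hash identify installations.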
- dependencies(name=None, deptype='all')[source]
Return a list of direct dependencies (nodes in the DAG).
- dependents(name=None, deptype='all')[source]
Return a list of direct dependents (nodes in the DAG).
- edges_from_dependents(name=None, deptype='all')[source]
Return a list of edges connecting this node in the DAG to parents.
- edges_to_dependencies(name=None, deptype='all')[source]
Return a list of edges connecting this node in the DAG to children.
- static ensure_no_deprecated(root)[source]
Raise if a deprecated spec is in the dag.
- Parameters:
root (Spec) – root spec to be analyzed
- Raises:
SpecDeprecatedError – if any deprecated spec is found
- static ensure_valid_variants(spec)[source]
Ensures that the variant attached to a spec are valid.
- Parameters:
spec (Spec) – spec to be analyzed
- Raises:
spack.variant.UnknownVariantError – on the first unknown variant found
- eq_dag(other, deptypes=True, vs=None, vo=None)[source]
True if the full dependency DAGs of specs are equal.
- property external
- property external_path
- flat_dependencies(**kwargs)[source]
Return a DependencyMap containing all of this spec’s dependencies with their constraints merged.
If copy is True, returns merged copies of its dependencies without modifying the spec it’s called on.
If copy is False, clears this spec’s dependencies and returns them. This disconnects all dependency links including transitive dependencies, except for concrete specs: if a spec is concrete it will not be disconnected from its dependencies (although a non-concrete spec with concrete dependencies will be disconnected from those dependencies).
- format(format_string='{name}{@versions}{%compiler.name}{@compiler.versions}{compiler_flags}{variants}{arch=architecture}{/abstract_hash}', **kwargs)[source]
Prints out particular pieces of a spec, depending on what is in the format string.
Using the {attribute} syntax, any field of the spec can be selected. Those attributes can be recursive. For example, s.format('{compiler.version}') will print the version of the compiler.
Commonly used attributes of the Spec for format strings include:

name
version
compiler
compiler.name
compiler.version
compiler_flags
variants
architecture
architecture.platform
architecture.os
architecture.target
prefix

Some additional special-case properties can be added:

hash[:len]     The DAG hash with optional length argument
spack_root     The spack root directory
spack_install  The spack install directory

The ^ sigil can be used to access dependencies by name. s.format('{^mpi.name}') will print the name of the MPI implementation in the spec.
The @, %, arch=, and / sigils can be used to include the sigil with the printed string. These sigils may only be used with the appropriate attributes, listed below:

@        {@version}, {@compiler.version}
%        {%compiler}, {%compiler.name}
arch=    {arch=architecture}
/        {/hash}, {/hash:7}, etc.

The @ sigil may also be used for any other property named version. Sigils printed with the attribute string are only printed if the attribute string is non-empty, and are colored according to the color of the attribute.
Sigils are not used for printing variants. Variants listed by name naturally print with their sigil. For example, spec.format('{variants.debug}') would print either +debug or ~debug depending on the value of the variant. Non-boolean variants print as name=value. To print variant names or values independently, use spec.format('{variants.<name>.name}') or spec.format('{variants.<name>.value}').
Spec format strings use \ as the escape character. Use \{ and \} for literal braces, and \\ for the literal \ character.
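The escaping and attribute-substitution rules above can be sketched with a toy formatter (illustrative only; Spack's real format() additionally handles sigils, colors, and nested Spec attributes):

```python
# Minimal {attribute}-style formatter with backslash escapes,
# mirroring the escape rules described above. Not Spack code.
def render(fmt, attrs):
    out, i = [], 0
    while i < len(fmt):
        ch = fmt[i]
        if ch == "\\" and i + 1 < len(fmt):
            out.append(fmt[i + 1])          # \{, \}, \\ -> literal char
            i += 2
        elif ch == "{":
            j = fmt.index("}", i)           # find matching close brace
            out.append(str(attrs[fmt[i + 1:j]]))
            i = j + 1
        else:
            out.append(ch)
            i += 1
    return "".join(out)

spec_attrs = {"name": "zlib", "version": "1.2.13"}
s = render(r"{name}@{version} \{literal\}", spec_attrs)
```

With these attributes the result is the familiar name@version string followed by literal braces.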
- static from_detection(spec_str, extra_attributes=None)[source]
Construct a spec from a spec string determined during external detection and attach extra attributes to it.
- Parameters:
spec_str – spec string determined during external detection
extra_attributes – dictionary of extra attributes to attach to the spec
- Returns:
external spec
- Return type:
spack.spec.Spec
- static from_dict(data)[source]
Construct a spec from JSON/YAML.
- Parameters:
data – a nested dict/list data structure read from YAML or JSON.
- static from_json(stream)[source]
Construct a spec from JSON.
- Parameters:
stream – string or file object to read from.
- static from_literal(spec_dict, normal=True)[source]
Builds a Spec from a dictionary containing the spec literal.
The dictionary must have a single top level key, representing the root, and as many secondary level keys as needed in the spec.
The keys can be either a string or a Spec or a tuple containing the Spec and the dependency types.
- Parameters:
Examples
A simple spec foo with no dependencies:

{'foo': None}

A spec foo with a (build, link) dependency bar:

{'foo': {'bar:build,link': None}}

A spec with a diamond dependency and various build types:

{'dt-diamond': {
    'dt-diamond-left:build,link': {
        'dt-diamond-bottom:build': None
    },
    'dt-diamond-right:build,link': {
        'dt-diamond-bottom:build,link,run': None
    }
}}

The same spec with a double copy of dt-diamond-bottom and no diamond structure:

{'dt-diamond': {
    'dt-diamond-left:build,link': {
        'dt-diamond-bottom:build': None
    },
    'dt-diamond-right:build,link': {
        'dt-diamond-bottom:build,link,run': None
    }
}, normal=False}

Constructing a spec using a Spec object as key:

mpich = Spec('mpich')
libelf = Spec('libelf@1.8.11')
expected_normalized = Spec.from_literal({
    'mpileaks': {
        'callpath': {
            'dyninst': {
                'libdwarf': {libelf: None},
                libelf: None
            },
            mpich: None
        },
        mpich: None
    },
})
- static from_signed_json(stream)[source]
Construct a spec from clearsigned json spec file.
- Parameters:
stream – string or file object to read from.
- static from_yaml(stream)[source]
Construct a spec from YAML.
- Parameters:
stream – string or file object to read from.
- property fullname
- property installed
Installation status of a package.
- Returns:
True if the package has been installed, False otherwise.
- property installed_upstream
Whether the spec is installed in an upstream repository.
- Returns:
True if the package is installed in an upstream, False otherwise.
- intersects(other: Spec, deps: bool = True) bool [source]
Return True if there exists at least one concrete spec that matches both self and other, otherwise False.
This operation is commutative, and if two specs intersect it means that one can constrain the other.
- Parameters:
other – spec to be checked for compatibility
deps – if True check compatibility of dependency nodes too, if False only check root
- lookup_hash()[source]
Given a spec with an abstract hash, return a copy of the spec with all properties and dependencies by looking up the hash in the environment, store, or finally, binary caches. This is non-destructive.
- node_dict_with_hashes(hash=<spack.hash_types.SpecHashDescriptor object>)[source]
Returns a node_dict of this spec with the dag hash added. If this spec is concrete, the full hash is added as well. If ‘build’ is in the hash_type, the build hash is also added.
- normalize(force=False, tests=False, user_spec_deps=None)[source]
When specs are parsed, any dependencies specified are hanging off the root, and ONLY the ones that were explicitly provided are there. Normalization turns a partial flat spec into a DAG, where:
Known dependencies of the root package are in the DAG.
Each node’s dependencies dict only contains its known direct deps.
There is only ONE unique spec for each package in the DAG.
This includes virtual packages. If there is a non-virtual package that provides a virtual package that is in the spec, then we replace the virtual package with the non-virtual one.
TODO: normalize should probably implement some form of cycle detection, to ensure that the spec is actually a DAG.
- property os
- property package
- property package_class
Internal package call gets only the class object for a package. Use this to just get package metadata.
- property patches
Return patch objects for any patch sha256 sums on this Spec.
This is for use after concretization to iterate over any patches associated with this spec.
TODO: this only checks in the package; it doesn’t resurrect old patches from install directories, but it probably should.
- property platform
- property prefix
- process_hash(length=None)[source]
Hash used to transfer specs among processes.
This hash includes build and test dependencies and is only used to serialize a spec and pass it around among processes.
- process_hash_bit_prefix(bits)[source]
Get the first <bits> bits of the DAG hash as an integer type.
- replace_hash()[source]
Given a spec with an abstract hash, attempt to populate all properties and dependencies by looking up the hash in the environment, store, or finally, binary caches. This is destructive.
- property root
Follow dependent links and find the root of this spec’s DAG.
Spack specs have a single root (the package being installed).
- satisfies(other, deps=True)[source]
This checks constraints on common dependencies against each other.
- property short_spec
Returns a version of the spec with the dependencies hashed instead of completely enumerated.
- spec_hash(hash)[source]
Utility method for computing different types of Spec hashes.
- Parameters:
hash (spack.hash_types.SpecHashDescriptor) – type of hash to generate.
- splice(other, transitive)[source]
Splices dependency "other" into this ("target") Spec, and return the result as a concrete Spec. If transitive, then other and its dependencies will be extrapolated to a list of Specs and spliced in accordingly.
For example, let there exist a dependency graph as follows:

T
| \
Z<-H

In this example, Spec T depends on H and Z, and H also depends on Z. Suppose, however, that we wish to use a different H, known as H'. This function will splice in the new H' in one of two ways:
1. transitively, where H' depends on the Z' it was built with, and the new T* also directly depends on this new Z', or
2. intransitively, where the new T* and H' both depend on the original Z.
Since the Spec returned by this splicing function is no longer deployed the same way it was built, any such changes are tracked by setting the build_spec to point to the corresponding dependency from the original Spec.
TODO: Extend this for non-concrete Specs.
- property spliced
Returns whether or not this Spec is being deployed as built i.e. whether or not this Spec has ever been spliced.
- property target
- to_dict(hash=<spack.hash_types.SpecHashDescriptor object>)[source]
Create a dictionary suitable for writing this spec to YAML or JSON.
This is like the dictionary that is ultimately written to a spec.json file in each Spack installation directory. For example, for sqlite:

{
  "spec": {
    "_meta": {
      "version": 2
    },
    "nodes": [
      {
        "name": "sqlite",
        "version": "3.34.0",
        "arch": {
          "platform": "darwin",
          "platform_os": "catalina",
          "target": "x86_64"
        },
        "compiler": {
          "name": "apple-clang",
          "version": "11.0.0"
        },
        "namespace": "builtin",
        "parameters": {
          "column_metadata": true,
          "fts": true,
          "functions": false,
          "rtree": false,
          "cflags": [],
          "cppflags": [],
          "cxxflags": [],
          "fflags": [],
          "ldflags": [],
          "ldlibs": []
        },
        "dependencies": [
          {
            "name": "readline",
            "hash": "4f47cggum7p4qmp3xna4hi547o66unva",
            "type": ["build", "link"]
          },
          {
            "name": "zlib",
            "hash": "uvgh6p7rhll4kexqnr47bvqxb3t33jtq",
            "type": ["build", "link"]
          }
        ],
        "hash": "tve45xfqkfgmzwcyfetze2z6syrg7eaf"
      }
      # ... more node dicts for readline and its dependencies ...
    ]
  }
}
Note that this dictionary starts with the ‘spec’ key, and what follows is a list starting with the root spec, followed by its dependencies in preorder. Each node in the list also has a ‘hash’ key that contains the hash of the node without the hash field included.
In the example, the package content hash is not included in the spec, but if package_hash were true there would be an additional field on each node called package_hash.
from_dict() can be used to read back in a spec that has been converted to a dictionary and serialized.
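The to_dict()/from_dict() round trip described above can be sketched with a hand-written node dict (field names follow the example output; real specs come from spack.spec):

```python
import json

# A miniature spec payload shaped like the to_dict() example above.
node = {
    "name": "sqlite",
    "version": "3.34.0",
    "dependencies": [
        {"name": "zlib",
         "hash": "uvgh6p7rhll4kexqnr47bvqxb3t33jtq",
         "type": ["build", "link"]},
    ],
}
payload = {"spec": {"_meta": {"version": 2}, "nodes": [node]}}

# Serialize to JSON and read it back, as to_json()/from_json() do
# on top of the dictionary form.
restored = json.loads(json.dumps(payload))
```

The restored structure is equal to the original, which is the property that lets Spack write spec.json files and reconstruct specs from them later.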
- to_node_dict(hash=<spack.hash_types.SpecHashDescriptor object>)[source]
Create a dictionary representing the state of this Spec.
to_node_dict creates the content that is eventually hashed by Spack to create identifiers like the DAG hash (see dag_hash()). Example result of to_node_dict for the sqlite package:

{
    'sqlite': {
        'version': '3.28.0',
        'arch': {
            'platform': 'darwin',
            'platform_os': 'mojave',
            'target': 'x86_64',
        },
        'compiler': {
            'name': 'apple-clang',
            'version': '10.0.0',
        },
        'namespace': 'builtin',
        'parameters': {
            'fts': 'true',
            'functions': 'false',
            'cflags': [],
            'cppflags': [],
            'cxxflags': [],
            'fflags': [],
            'ldflags': [],
            'ldlibs': [],
        },
        'dependencies': {
            'readline': {
                'hash': 'zvaa4lhlhilypw5quj3akyd3apbq5gap',
                'type': ['build', 'link'],
            }
        },
    }
}

Note that the dictionary returned does not include the hash of the root of the spec, though it does include hashes for each dependency, and (optionally) the package file corresponding to each node.
See to_dict() for a "complete" spec hash, with hashes for each node and nodes for each dependency (instead of just their hashes).
- Parameters:
hash (spack.hash_types.SpecHashDescriptor) – type of hash to generate.
- traverse(**kwargs)[source]
Shorthand for traverse_nodes().
- traverse_edges(**kwargs)[source]
Shorthand for traverse_edges().
- tree(**kwargs)[source]
Prints out this spec and its dependencies, tree-formatted with indentation.
Status function may either output a boolean or an InstallStatus
- update_variant_validate(variant_name, values)[source]
If it is not already there, adds the variant named variant_name to the spec based on the definition contained in the package metadata. Validates the variant and values before returning.
Used to add values to a variant without being sensitive to the variant being single or multi-valued. If the variant already exists on the spec it is assumed to be multi-valued and the values are appended.
- Parameters:
variant_name – the name of the variant to add or append to
values – the value or values (as a tuple) to add/append to the variant
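A sketch of the append semantics described above (real validation against package metadata lives in spack.variant; this toy version skips it):

```python
# Toy variant store: maps variant name -> tuple of values. If the
# variant exists it is treated as multi-valued and values append,
# mirroring the behavior described in the docstring.
def update_variant(variants, name, values):
    if not isinstance(values, tuple):
        values = (values,)                 # accept a single value too
    existing = variants.get(name, ())
    variants[name] = tuple(existing) + values
    return variants

v = update_variant({}, "languages", ("c", "c++"))
v = update_variant(v, "languages", "fortran")
```

Callers do not need to care whether the variant is single- or multi-valued; a bare value and a tuple of values are handled uniformly.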
- validate_detection()[source]
Validate the detection of an external spec.
This method is used as part of Spack’s detection protocol, and is not meant for client code use.
- validate_or_raise()[source]
Checks that names and values in this spec are real. If they’re not, it will raise an appropriate exception.
- property version
- property virtual
- exception spack.spec.SpecDeprecatedError(message, long_message=None)[source]
Bases:
SpecError
Raised when a spec concretizes to a deprecated spec or dependency.
- exception spack.spec.SpecParseError(parse_error)[source]
Bases:
SpecError
Wrapper for ParseError for when we’re parsing specs.
- property long_message
- exception spack.spec.UnsatisfiableArchitectureSpecError(provided, required)[source]
Bases:
UnsatisfiableSpecError
Raised when a spec architecture conflicts with package constraints.
- exception spack.spec.UnsatisfiableCompilerFlagSpecError(provided, required)[source]
Bases:
UnsatisfiableSpecError
Raised when a spec compiler flag conflicts with package constraints.
- exception spack.spec.UnsatisfiableCompilerSpecError(provided, required)[source]
Bases:
UnsatisfiableSpecError
Raised when a spec compiler conflicts with package constraints.
- exception spack.spec.UnsatisfiableDependencySpecError(provided, required)[source]
Bases:
UnsatisfiableSpecError
Raised when a dependency of constrained specs is incompatible.
- exception spack.spec.UnsatisfiableProviderSpecError(provided, required)[source]
Bases:
UnsatisfiableSpecError
Raised when a provider is supplied but constraints don’t match a vpkg requirement
- exception spack.spec.UnsatisfiableSpecNameError(provided, required)[source]
Bases:
UnsatisfiableSpecError
Raised when two specs aren’t even for the same package.
- exception spack.spec.UnsatisfiableVersionSpecError(provided, required)[source]
Bases:
UnsatisfiableSpecError
Raised when a spec version conflicts with package constraints.
spack.spec_list module
- exception spack.spec_list.InvalidSpecConstraintError(message, long_message=None)[source]
Bases:
SpecListError
Error class for invalid spec constraints at concretize time.
- class spack.spec_list.SpecList(name='specs', yaml_list=None, reference=None)[source]
Bases:
object
- property is_matrix
- property specs_as_constraints
- property specs_as_yaml_list
- exception spack.spec_list.SpecListError(message, long_message=None)[source]
Bases:
SpackError
Error class for all errors related to SpecList objects.
- exception spack.spec_list.UndefinedReferenceError(message, long_message=None)[source]
Bases:
SpecListError
Error class for undefined references in Spack stacks.
spack.stage module
- class spack.stage.DIYStage(path)[source]
Bases:
object
Simple class that allows any directory to be a spack stage. Consequently, it does not expect or require that the source path adhere to the standard directory naming convention.
- property expanded
Returns True since the source_path must exist.
- managed_by_spack = False
- class spack.stage.ResourceStage(url_or_fetch_strategy, root, resource, **kwargs)[source]
Bases:
Stage
- exception spack.stage.RestageError(message, long_message=None)[source]
Bases:
StageError
Error encountered during restaging.
- class spack.stage.Stage(url_or_fetch_strategy, name=None, mirror_paths=None, keep=False, path=None, lock=True, search_fn=None)[source]
Bases:
object
Manages a temporary stage directory for building.
A Stage object is a context manager that handles a directory where some source code is downloaded and built before being installed. It handles fetching the source code, either as an archive to be expanded or by checking it out of a repository. A stage’s lifecycle looks like this:
with Stage() as stage:      # Context manager creates and destroys the
                            # stage directory
    stage.fetch()           # Fetch a source archive into the stage.
    stage.expand_archive()  # Expand the archive into source_path.
    <install>               # Build and install the archive.
                            # (handled by user of Stage)
When used as a context manager, the stage is automatically destroyed if no exception is raised by the context. If an exception is raised, the stage is left in the filesystem and NOT destroyed, for potential reuse later.
You can also use the stage’s create/destroy functions manually, like this:
stage = Stage()
try:
    stage.create()          # Explicitly create the stage directory.
    stage.fetch()           # Fetch a source archive into the stage.
    stage.expand_archive()  # Expand the archive into source_path.
    <install>               # Build and install the archive.
                            # (handled by user of Stage)
finally:
    stage.destroy()         # Explicitly destroy the stage directory.
There are two kinds of stages: named and unnamed. Named stages can persist between runs of spack, e.g. if you fetched a tarball but didn’t finish building it, you won’t have to fetch it again.
Unnamed stages are created using standard mkdtemp mechanisms or similar, and are intended to persist for only one run of spack.
- property archive_file
Path to the source archive within this stage directory.
- cache_mirror(mirror, stats)[source]
Perform a fetch if the resource is not already cached
- Parameters:
mirror (spack.caches.MirrorCache) – the mirror to cache this Stage’s resource in
stats (spack.mirror.MirrorStats) – this is updated depending on whether the caching operation succeeded or failed
- check()[source]
Check the downloaded archive against a checksum digest. No-op if this stage checks code out of a repository.
- expand_archive()[source]
Changes to the stage directory and attempts to expand the downloaded archive. Fails if the stage is not set up or if the archive is not yet downloaded.
- property expanded
Returns True if source path expanded; else False.
- property expected_archive_files
Possible archive file paths.
- managed_by_spack = True
- property save_filename
- property source_path
Returns the well-known source directory path.
- class spack.stage.StageComposite[source]
Bases:
Composite
Composite for Stage type objects. The first item in this composite is considered to be the root package, and operations that return a value are forwarded to it.
- property archive_file
- property expanded
- property path
- property source_path
- exception spack.stage.StageError(message, long_message=None)[source]
Bases:
SpackError
Superclass for all errors encountered during staging.
- exception spack.stage.StagePathError(message, long_message=None)[source]
Bases:
StageError
Error encountered with stage path.
- exception spack.stage.VersionFetchError(message, long_message=None)[source]
Bases:
StageError
Raised when we can’t determine a URL to fetch a package.
- spack.stage.create_stage_root(path: str) None [source]
Create the stage root directory and ensure appropriate access perms.
- spack.stage.ensure_access(file)[source]
Ensure we can access a directory and die with an error if we can’t.
- spack.stage.get_checksums_for_versions(url_dict, name, **kwargs)[source]
Fetches and checksums archives from URLs.
This function is called by both spack checksum and spack create. The first_stage_function argument allows the caller to inspect the first downloaded archive, e.g., to determine the build system.
- Parameters:
url_dict (dict) – A dictionary of the form: version -> URL
name (str) – The name of the package
first_stage_function (Callable) – function that takes a Stage and a URL; this is run on the stage of the first URL downloaded
keep_stage (bool) – whether to keep staging area when command completes
batch (bool) – whether to ask user how many versions to fetch (false) or fetch all versions (true)
latest (bool) – whether to take the latest version (true) or all (false)
fetch_options (dict) – Options used for the fetcher (such as timeout or cookies)
- Returns:
A multi-line string containing versions and corresponding hashes
- Return type:
(str)
spack.store module
Components that manage Spack’s installation tree.
An install tree, or “build store” consists of two parts:
A package database that tracks what is installed.
A directory layout that determines how the installations are laid out.
The store contains all the install prefixes for packages installed by Spack. The simplest store could just contain prefixes named by DAG hash, but we use a fancier directory layout to make browsing the store and debugging easier.
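How a directory layout might project a spec to an install prefix can be sketched as follows (the path scheme here is illustrative, not Spack's default projection):

```python
# Toy projection: expand a path template from spec attributes to get
# the install prefix under the store root. Spack's real projections
# live in the directory layout and support richer format tokens.
def prefix_for(root, spec):
    projection = "{platform}/{name}-{version}-{hash}"
    rel = projection.format(**spec)
    return f"{root}/{rel}"

spec = {"platform": "linux", "name": "zlib",
        "version": "1.2.13", "hash": "tve45xf"}
p = prefix_for("/opt/spack", spec)
```

Embedding the name and version in the path, rather than just the DAG hash, is what makes the store browsable and easier to debug, as noted above.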
- exception spack.store.MatchError(message, long_message=None)[source]
Bases:
SpackError
Error occurring when trying to match specs in store against a constraint
- class spack.store.Store(root, unpadded_root=None, projections=None, hash_length=None)[source]
Bases:
object
A store is a path full of installed Spack packages.
Stores consist of packages installed according to a DirectoryLayout, along with an index, or _database_, of their contents. The directory layout controls what paths look like and how Spack ensures that each unique spec gets its own unique directory (or not, though we don't recommend that). The database is a single file that caches metadata for the entire Spack installation. It prevents us from having to spider the install tree to figure out what's there.
- Parameters:
root (str) – path to the root of the install tree
unpadded_root (str) – path to the root of the install tree without padding; the sbang script has to be installed here to work with padded roots
path_scheme (str) – expression according to guidelines in spack.util.path that describes how to construct a path to a package prefix in this store
hash_length (int) – length of the hashes used in the directory layout; spec hash suffixes will be truncated to this length
- spack.store.default_install_tree_root = '/home/docs/checkouts/readthedocs.org/user_builds/spack/checkouts/v0.20.3/lib/spack/docs/_spack_root/opt/spack'
default installation root, relative to the Spack install path
- spack.store.find(constraints, multiple=False, query_fn=None, **kwargs)[source]
Return a list of specs matching the constraints passed as inputs.
At least one spec per constraint must match, otherwise the function will error with an appropriate message.
By default, this function queries the current store, but a custom query function can be passed to hit any other source of concretized specs (e.g. a binary cache).
The query function must accept a spec as its first argument.
- Parameters:
constraints (List[spack.spec.Spec]) – specs to be matched against installed packages
multiple (bool) – if True multiple matches per constraint are admitted
query_fn (Callable) – query function to get matching specs; by default, spack.store.db.query
**kwargs – keyword arguments forwarded to the query function
- Returns:
List of matching specs
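The contract of find() — at least one match per constraint, optional multiple matches, and a pluggable query function — can be sketched like this (names and data are stand-ins, not Spack's API):

```python
# Toy version of the find() contract described above: each
# constraint must match at least one spec from the query source,
# otherwise the lookup errors out.
def find(constraints, query_fn, multiple=False):
    results = []
    for constraint in constraints:
        matches = query_fn(constraint)
        if not matches:
            raise ValueError(f"no installed spec matches {constraint!r}")
        results.extend(matches if multiple else matches[:1])
    return results

# Stand-in "store": constraint string -> list of installed specs.
installed = {"zlib": ["zlib@1.2.13"],
             "cmake": ["cmake@3.24", "cmake@3.26"]}
specs = find(["zlib", "cmake"], lambda c: installed.get(c, []),
             multiple=True)
```

Passing a different query function is what lets the real find() target other sources of concretized specs, such as a binary cache, instead of the local database.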
- spack.store.parse_install_tree(config_dict)[source]
Parse config settings and return values relevant to the store object.
- Parameters:
config_dict (dict) – dictionary of config values, as returned from spack.config.get(‘config’)
- Returns:
triple of the install tree root, the unpadded install tree root (before padding was applied), and the projections for the install tree
- Return type:
(tuple)
Encapsulate backwards compatibility capabilities for install_tree and deprecated values that are now parsed as part of install_tree.
- spack.store.reinitialize()[source]
Restore globals to the same state they would have at start-up. Return a token containing the state of the store before reinitialization.
spack.subprocess_context module
This module handles transmission of Spack state to child processes started using the ‘spawn’ start method. Notably, installations are performed in a subprocess and require transmitting the Package object (in such a way that the repository is available for importing when it is deserialized); installations performed in Spack unit tests may include additional modifications to global state in memory that must be replicated in the child process.
- class spack.subprocess_context.PackageInstallContext(pkg)[source]
Bases:
object
Captures the in-memory process state of a package installation that needs to be transmitted to a child process.
- class spack.subprocess_context.TestState[source]
Bases:
object
Spack tests may modify, in memory, state that is normally read from disk; this object is responsible for properly serializing that state so it can be applied to a subprocess. This isn't needed outside of a testing environment, but the logic is designed to behave the same inside or outside of tests.
spack.tag module
Classes and functions to manage package tags
- class spack.tag.TagIndex(repository)[source]
Bases:
Mapping
Maps tags to list of packages.
- merge(other)[source]
Merge another tag index into this one.
- Parameters:
other (TagIndex) – tag index to be merged
- property tags
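The merge semantics can be sketched with plain dicts mapping tags to package-name lists (a hypothetical simplification; the real TagIndex is backed by a repository):

```python
def merge_tag_indexes(ours, other):
    """Merge the `other` mapping of tag -> package names into `ours` in place,
    keeping each list sorted and free of duplicates."""
    for tag, packages in other.items():
        ours[tag] = sorted(set(ours.get(tag, [])) | set(packages))

idx = {"hpc": ["mpich"]}
merge_tag_indexes(idx, {"hpc": ["openmpi"], "viz": ["paraview"]})
# idx == {"hpc": ["mpich", "openmpi"], "viz": ["paraview"]}
```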
- exception spack.tag.TagIndexError(message, long_message=None)[source]
Bases:
SpackError
Raised when there is a problem with a TagIndex.
- spack.tag.packages_with_tags(tags, installed, skip_empty)[source]
Returns a dict, indexed by tag, containing lists of names of packages with that tag or, if no tags are given, entries for all available tags.
- Parameters:
tags (list or None) – list of tags of interest or None for all
installed (bool) – True to consider only installed packages; False to consider all packages with the tag
skip_empty (bool) – True to exclude tags with no associated packages; False to include entries for all tags even when no packages carry them
spack.target module
- class spack.target.Target(name, module_name=None)[source]
Bases:
object
- property name
- optimization_flags(compiler)[source]
Returns the flags needed to optimize for this target using the compiler passed as argument.
- Parameters:
compiler (spack.spec.CompilerSpec or spack.compiler.Compiler) – object that contains both the name and the version of the compiler we want to use
- to_dict_or_value()[source]
Returns a dict or a value representing the current target.
String values are used to keep backward compatibility with generic targets, e.g. x86_64 or ppc64. More specific micro-architectures will return a dictionary which contains information on the name, features, vendor, generation and parents of the current target.
spack.tengine module
- class spack.tengine.Context[source]
Bases:
object
Base class for context classes that are used with the template engine.
- context_properties = []
- class spack.tengine.ContextMeta(name, bases, attr_dict)[source]
Bases:
type
Meta class for Context. It helps reduce the boilerplate in client code.
- spack.tengine.context_property(func)
A saner way to use the decorator
spack.traverse module
- spack.traverse.traverse_edges(specs, root=True, order='pre', cover='nodes', direction='children', deptype='all', depth=False, key=<built-in function id>, visited=None)[source]
Generator that yields edges from the DAG, starting from a list of root specs.
- Parameters:
specs (list) – List of root specs (considered to be depth 0)
root (bool) – Yield the root nodes themselves
order (str) – What order of traversal to use in the DAG. For depth-first search this can be pre or post. For BFS this should be breadth. For topological order use topo.
cover (str) – Determines how extensively to cover the DAG. Possible values:
nodes – Visit each unique node in the DAG only once.
edges – If a node has been visited once but is reached along a new path, it's accepted, but not recursively followed. This traverses each 'edge' in the DAG once.
paths – Explore every unique path reachable from the root. This descends into visited subtrees and will accept nodes multiple times if they're reachable by multiple paths.
direction (str) – children or parents. If children, does a traversal of this spec's children. If parents, traverses upwards in the DAG towards the root.
depth (bool) – When False, yield just edges. When True, yield the tuple (depth, edge), where depth corresponds to the depth at which edge.spec was discovered.
key – function that takes a spec and outputs a key for uniqueness test.
visited (set or None) – a set of nodes not to follow
- Returns:
A generator that yields DependencySpec if depth is False, or a tuple of (depth, DependencySpec) if depth is True.
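As a sketch of the cover='nodes' behavior, here is a pre-order DFS over a plain dict-based DAG (illustrative only: the names and the dict representation are assumptions, and the real function operates on spack.spec.Spec objects with many more options):

```python
def traverse_edges_nodes_cover(dag, roots):
    """Yield (parent, child) edges in DFS pre-order, entering each node once.
    Roughly the cover='nodes' behavior: edges into already-visited nodes
    are skipped rather than yielded."""
    visited = set(roots)

    def dfs(node):
        for child in dag.get(node, ()):
            if child in visited:
                continue
            visited.add(child)
            yield (node, child)
            yield from dfs(child)

    for root in roots:
        yield from dfs(root)

# A diamond-shaped DAG: mpich is reachable along two paths but entered once.
dag = {"mpileaks": ["mpich", "callpath"], "callpath": ["mpich"]}
edges = list(traverse_edges_nodes_cover(dag, ["mpileaks"]))
# edges == [("mpileaks", "mpich"), ("mpileaks", "callpath")]
```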
- spack.traverse.traverse_nodes(specs, root=True, order='pre', cover='nodes', direction='children', deptype='all', depth=False, key=<built-in function id>, visited=None)[source]
Generator that yields specs from the DAG, starting from a list of root specs.
- Parameters:
specs (list) – List of root specs (considered to be depth 0)
root (bool) – Yield the root nodes themselves
order (str) – What order of traversal to use in the DAG. For depth-first search this can be pre or post. For BFS this should be breadth.
cover (str) – Determines how extensively to cover the DAG. Possible values:
nodes – Visit each unique node in the DAG only once.
edges – If a node has been visited once but is reached along a new path, it's accepted, but not recursively followed. This traverses each 'edge' in the DAG once.
paths – Explore every unique path reachable from the root. This descends into visited subtrees and will accept nodes multiple times if they're reachable by multiple paths.
direction (str) – children or parents. If children, does a traversal of this spec's children. If parents, traverses upwards in the DAG towards the root.
depth (bool) – When False, yield just specs. When True, yield the tuple (depth, spec), where depth corresponds to the depth at which the spec was discovered.
key – function that takes a spec and outputs a key for uniqueness test.
visited (set or None) – a set of nodes not to follow
- Yields:
By default Spec, or a tuple (depth, Spec) if depth is set to True.
- spack.traverse.traverse_tree(specs, cover='nodes', deptype='all', key=<built-in function id>, depth_first=True)[source]
Generator that yields (depth, DependencySpec) tuples in depth-first pre-order, so that a tree can be printed from it.
- Parameters:
specs (list) – List of root specs (considered to be depth 0)
cover (str) – Determines how extensively to cover the DAG. Possible values:
nodes – Visit each unique node in the DAG only once.
edges – If a node has been visited once but is reached along a new path, it's accepted, but not recursively followed. This traverses each 'edge' in the DAG once.
paths – Explore every unique path reachable from the root. This descends into visited subtrees and will accept nodes multiple times if they're reachable by multiple paths.
key – function that takes a spec and outputs a key for uniqueness test.
depth_first (bool) – Explore the tree in depth-first or breadth-first order. When setting depth_first=True and cover=nodes, each spec only occurs once at the shallowest level, which is useful when rendering the tree in a terminal.
- Returns:
A generator that yields (depth, DependencySpec) tuples in such an order that a tree can be printed.
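The (depth, edge) tuples make indented tree printing trivial. A minimal sketch with a dict-based DAG and plain strings instead of DependencySpec objects (all names here are illustrative, not Spack's):

```python
def tree_tuples(dag, root, depth=0, seen=None):
    """Yield (depth, node) tuples in DFS pre-order, visiting each node once
    (like cover='nodes'), so the output can be rendered as an indented tree."""
    seen = set() if seen is None else seen
    if root in seen:
        return
    seen.add(root)
    yield (depth, root)
    for child in dag.get(root, ()):
        yield from tree_tuples(dag, child, depth + 1, seen)

dag = {"mpileaks": ["callpath"], "callpath": ["mpich"]}
lines = ["    " * d + name for d, name in tree_tuples(dag, "mpileaks")]
# lines == ["mpileaks", "    callpath", "        mpich"]
```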
spack.url module
This module has methods for parsing names and versions of packages from URLs. The idea is to allow package creators to supply nothing more than the download location of the package, and figure out version and name information from there.
Example: when spack is given the following URL:
It can figure out that the package name is hdf, and that it is at version 4.2.12. This is useful for making the creation of packages simple: a user just supplies a URL and skeleton code is generated automatically.
Spack can also figure out that it can most likely download 4.2.6 at this URL:
This is useful if a user asks for a package at a particular version number; spack doesn’t need anyone to tell it where to get the tarball even though it’s never been told about that version before.
- exception spack.url.UndetectableNameError(path)[source]
Bases:
UrlParseError
Raised when we can’t parse a package name from a string.
- exception spack.url.UndetectableVersionError(path)[source]
Bases:
UrlParseError
Raised when we can’t parse a version from a string.
- exception spack.url.UrlParseError(msg, path)[source]
Bases:
SpackError
Raised when the URL module can’t parse something correctly.
- spack.url.color_url(path, **kwargs)[source]
Color the parts of the url according to Spack’s parsing.
- Colors are:
Cyan: The version found by parse_version_offset().
Red: The name found by parse_name_offset().
Green: Instances of version string from substitute_version().
Magenta: Instances of the name (protected from substitution).
- spack.url.cumsum(elts, init=0, fn=<function <lambda>>)[source]
Return cumulative sum of result of fn on each element in elts.
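A minimal sketch consistent with that description, assuming fn defaults to the identity:

```python
def cumsum(elts, init=0, fn=lambda x: x):
    """Return the cumulative sums of fn(e) for each element e in elts,
    starting the running total at init."""
    sums, total = [], init
    for e in elts:
        total += fn(e)
        sums.append(total)
    return sums

totals = cumsum([1, 2, 3])             # [1, 3, 6]
lengths = cumsum(["ab", "c"], fn=len)  # [2, 3]
```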
- spack.url.determine_url_file_extension(path)[source]
This returns the type of archive a URL refers to. This is sometimes confusing because of URLs like:
Where the URL doesn’t actually contain the filename. We need to know what type it is so that we can appropriately name files in mirrors.
- spack.url.find_all(substring, string)[source]
Returns a list containing the indices of every occurrence of substring in string.
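A sketch of such a helper; whether the real implementation counts overlapping occurrences is not stated here, so this version counts them, as an assumption:

```python
def find_all(substring, string):
    """Return the indices of every (possibly overlapping) occurrence of
    substring in string."""
    indices, start = [], 0
    while True:
        i = string.find(substring, start)
        if i == -1:
            return indices
        indices.append(i)
        start = i + 1  # advance by one so overlapping matches are found

hits = find_all("ana", "banana")  # [1, 3]
```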
- spack.url.find_list_urls(url)[source]
Find good list URLs for the supplied URL.
By default, returns the dirname of the archive path.
Provides special treatment for the following websites, which have a unique list URL different from the dirname of the download URL:
GitHub: https://github.com/<repo>/<name>/releases
GitLab: https://gitlab.*/<repo>/<name>/tags
BitBucket: https://bitbucket.org/<repo>/<name>/downloads/?tab=tags
CRAN
PyPI: https://pypi.org/simple/<name>/
LuaRocks: https://luarocks.org/modules/<repo>/<name>
Note: this function is called by spack versions, spack checksum, and spack create, but not by spack fetch or spack install.
- spack.url.insensitize(string)[source]
Change upper and lowercase letters to be case insensitive in the provided string. e.g., ‘a’ becomes ‘[Aa]’, ‘B’ becomes ‘[bB]’, etc. Use for building regexes.
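A minimal sketch of this transformation using re.sub with a replacement function (the bracket ordering shown is an assumption; it does not affect matching):

```python
import re

def insensitize(string):
    """Replace each ASCII letter with a character class matching both cases,
    e.g. 'a' -> '[Aa]', for building case-insensitive regexes."""
    return re.sub(
        r"[A-Za-z]",
        lambda m: "[%s%s]" % (m.group(0).upper(), m.group(0).lower()),
        string,
    )

pattern = insensitize("hdf")  # '[Hh][Dd][Ff]'
```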
- spack.url.parse_name(path, ver=None)[source]
Try to determine the name of a package from its filename or URL.
- Parameters:
- Returns:
The name of the package
- Return type:
- Raises:
UndetectableNameError – If the URL does not match any regexes
- spack.url.parse_name_and_version(path)[source]
Try to determine the name of a package and extract its version from its filename or URL.
- Parameters:
path (str) – The filename or URL for the package
- Returns:
a tuple containing the package (name, version)
- Return type:
- Raises:
UndetectableVersionError – If the URL does not match any regexes
UndetectableNameError – If the URL does not match any regexes
- spack.url.parse_name_offset(path, v=None)[source]
Try to determine the name of a package from its filename or URL.
- Parameters:
- Returns:
- A tuple containing:
name of the package, first index of name, length of name, the index of the matching regex, the matching regex
- Return type:
- Raises:
UndetectableNameError – If the URL does not match any regexes
- spack.url.parse_version(path)[source]
Try to extract a version string from a filename or URL.
- Parameters:
path (str) – The filename or URL for the package
- Returns:
The version of the package
- Return type:
- Raises:
UndetectableVersionError – If the URL does not match any regexes
- spack.url.parse_version_offset(path)[source]
Try to extract a version string from a filename or URL.
- Parameters:
path (str) – The filename or URL for the package
- Returns:
- A tuple containing:
version of the package, first index of version, length of version string, the index of the matching regex, the matching regex
- Return type:
- Raises:
UndetectableVersionError – If the URL does not match any regexes
- spack.url.split_url_extension(path)[source]
Some URLs have a query string, e.g.:
https://github.com/losalamos/CLAMR/blob/packages/PowerParser_v2.0.7.tgz?raw=true
http://www.apache.org/dyn/closer.cgi?path=/cassandra/1.2.0/apache-cassandra-1.2.0-rc2-bin.tar.gz
https://gitlab.kitware.com/vtk/vtk/repository/archive.tar.bz2?ref=v7.0.0
In (1), the query string needs to be stripped to get at the extension, but in (2) & (3), the filename is IN a single final query argument.
This strips the URL into three pieces: prefix, ext, and suffix. The suffix contains anything that was stripped off the URL to get at the file extension. In (1), it will be '?raw=true', but in (2), it will be empty. In (3) the suffix is a parameter that follows after the file extension, e.g.:
('https://github.com/losalamos/CLAMR/blob/packages/PowerParser_v2.0.7', '.tgz', '?raw=true')
('http://www.apache.org/dyn/closer.cgi?path=/cassandra/1.2.0/apache-cassandra-1.2.0-rc2-bin', '.tar.gz', None)
('https://gitlab.kitware.com/vtk/vtk/repository/archive', '.tar.bz2', '?ref=v7.0.0')
- spack.url.strip_name_suffixes(path, version)[source]
Most tarballs contain a package name followed by a version number. However, some also contain extraneous information in-between the name and version:
rgb-1.0.6
converge_install_2.3.16
jpegsrc.v9b
These strings are not part of the package name and should be ignored. This function strips the version number and any extraneous suffixes off and returns the remaining string. The goal is that the name is always the last thing in path:
rgb
converge
jpeg
- spack.url.strip_version_suffixes(path)[source]
Some tarballs contain extraneous information after the version:
bowtie2-2.2.5-source
libevent-2.0.21-stable
cuda_8.0.44_linux.run
These strings are not part of the version number and should be ignored. This function strips those suffixes off and returns the remaining string. The goal is that the version is always the last thing in path:
bowtie2-2.2.5
libevent-2.0.21
cuda_8.0.44
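The stripping can be sketched with a small, hypothetical suffix list applied right to left (the real function knows far more suffix patterns):

```python
import re

# Hypothetical, much-reduced list of suffix patterns; ordered so that
# trailing pieces (like '.run') are removed before inner ones ('_linux').
_SUFFIXES = [r"\.run", "_linux", "-source", "-stable"]

def strip_version_suffixes(path):
    """Strip known non-version suffixes so the version ends the string."""
    for suffix in _SUFFIXES:
        path = re.sub(suffix + r"$", "", path)
    return path

strip_version_suffixes("bowtie2-2.2.5-source")   # 'bowtie2-2.2.5'
strip_version_suffixes("cuda_8.0.44_linux.run")  # 'cuda_8.0.44'
```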
- spack.url.substitute_version(path, new_version)[source]
Given a URL or archive name, find the version in the path and substitute the new version for it. Replace all occurrences of the version if they don’t overlap with the package name.
Simple example:
substitute_version('http://www.mr511.de/software/libelf-0.8.13.tar.gz', '2.9.3')
>>> 'http://www.mr511.de/software/libelf-2.9.3.tar.gz'
Complex example:
substitute_version('https://www.hdfgroup.org/ftp/HDF/releases/HDF4.2.12/src/hdf-4.2.12.tar.gz', '2.3')
>>> 'https://www.hdfgroup.org/ftp/HDF/releases/HDF2.3/src/hdf-2.3.tar.gz'
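A deliberately naive sketch of the substitution step, assuming the old version string is already known (the real function locates the version itself and protects the package name from substitution):

```python
def substitute_version_naive(path, old_version, new_version):
    """Replace every occurrence of old_version with new_version.
    Naive: unlike the real substitute_version, the caller must supply the
    old version, and the package name is not protected from substitution."""
    return path.replace(old_version, new_version)

url = "https://www.hdfgroup.org/ftp/HDF/releases/HDF4.2.12/src/hdf-4.2.12.tar.gz"
new_url = substitute_version_naive(url, "4.2.12", "2.3")
# 'https://www.hdfgroup.org/ftp/HDF/releases/HDF2.3/src/hdf-2.3.tar.gz'
```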
- spack.url.substitution_offsets(path)[source]
This returns offsets for substituting versions and names in the provided path. It is a helper for substitute_version().
spack.user_environment module
- spack.user_environment.environment_modifications_for_spec(spec, view=None, set_package_py_globals=True)[source]
List of environment (shell) modifications to be processed for spec.
This list is specific to the location of the spec or its projection in the view.
- Parameters:
spec (spack.spec.Spec) – spec for which to list the environment modifications
view – view associated with the spec passed as first argument
set_package_py_globals (bool) – whether or not to set the global variables in the package.py files (this may be problematic when using buildcaches that have been built on a different but compatible OS)
- spack.user_environment.prefix_inspections(platform)[source]
Get list of prefix inspections for platform
- Parameters:
platform (str) – the name of the platform to consider. The platform determines what environment variables Spack will use for some inspections.
- Returns:
- A dictionary mapping subdirectory names to lists of environment
variables to modify with that directory if it exists.
- spack.user_environment.spack_loaded_hashes_var = 'SPACK_LOADED_HASHES'
Environment variable name Spack uses to track individually loaded packages
spack.variant module
The variant module contains data structures that are needed to manage variants both in packages and in specs.
- class spack.variant.AbstractVariant(name, value, propagate=False)[source]
Bases:
object
A variant that has not yet decided who it wants to be. It behaves like a multi-valued variant which could do things.
This kind of variant is generated during parsing of expressions like foo=bar and differs from multi-valued variants because it will satisfy any other variant with the same name. This is because it could do so if it grows up to be a multi-valued variant with the right set of values.
- compatible(other)[source]
Returns True if self and other are compatible, False otherwise.
As there is no semantic check, two VariantSpecs are compatible if either they contain the same value or they are both multi-valued.
- Parameters:
other – instance against which we test compatibility
- Returns:
True or False
- Return type:
- constrain(other)[source]
Modify self to match all the constraints for other if both instances are multi-valued. Returns True if self changed, False otherwise.
- Parameters:
other – instance against which we constrain self
- Returns:
True or False
- Return type:
- copy()[source]
Returns an instance of a variant equivalent to self
- Returns:
a copy of self
- Return type:
>>> a = MultiValuedVariant('foo', True)
>>> b = a.copy()
>>> assert a == b
>>> assert a is not b
- intersects(other)[source]
Returns True if there is a variant matching both self and other, False otherwise.
- satisfies(other)[source]
Returns True if other.name == self.name, because any value that other holds and is not in self yet could be added.
- Parameters:
other – constraint to be met for the method to return True
- Returns:
True or False
- Return type:
- property value
Returns a tuple of strings containing the values stored in the variant.
- Returns:
values stored in the variant
- Return type:
- class spack.variant.BoolValuedVariant(name, value, propagate=False)[source]
Bases:
SingleValuedVariant
A variant that can hold either True or False.
BoolValuedVariant can also hold the value '*', for coerced comparisons between foo=* and +foo or ~foo.
- class spack.variant.DisjointSetsOfValues(*sets)[source]
Bases:
Sequence
Allows combinations from one of many mutually exclusive sets.
The value ('none',) is reserved to denote the empty set and therefore no other set can contain the item 'none'.
- Parameters:
*sets (list) – mutually exclusive sets of values
- feature_values
Attribute used to track values which correspond to features which can be enabled or disabled as understood by the package’s build system.
- property validator
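The mutual-exclusion rule can be sketched as a subset check against each set (function and variable names here are illustrative):

```python
def values_are_valid(chosen, disjoint_sets):
    """Return True if every chosen value comes from a single one of the
    mutually exclusive sets; ('none',) stands for the empty choice."""
    chosen = set(chosen)
    return any(chosen <= set(s) for s in disjoint_sets)

sets = [("none",), ("auto",), ("mpich", "openmpi")]
values_are_valid(["mpich", "openmpi"], sets)  # True: both from one set
values_are_valid(["auto", "mpich"], sets)     # False: mixes two sets
```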
- exception spack.variant.DuplicateVariantError(message, long_message=None)[source]
Bases:
SpecError
Raised when the same variant occurs in a spec twice.
- exception spack.variant.InconsistentValidationError(vspec, variant)[source]
Bases:
SpecError
Raised if the wrong validator is used to validate a variant.
- exception spack.variant.InvalidVariantForSpecError(variant, when, spec)[source]
Bases:
SpecError
Raised when an invalid conditional variant is specified.
- exception spack.variant.InvalidVariantValueCombinationError(message, long_message=None)[source]
Bases:
SpecError
Raised when a variant has values ‘*’ or ‘none’ with other values.
- exception spack.variant.InvalidVariantValueError(variant, invalid_values, pkg)[source]
Bases:
SpecError
Raised when a valid variant has at least one invalid value.
- class spack.variant.MultiValuedVariant(name, value, propagate=False)[source]
Bases:
AbstractVariant
A variant that can hold multiple values at once.
- exception spack.variant.MultipleValuesInExclusiveVariantError(variant, pkg)[source]
Bases:
SpecError
,ValueError
Raised when multiple values are present in a variant that wants only one.
- class spack.variant.SingleValuedVariant(name, value, propagate=False)[source]
Bases:
AbstractVariant
A variant that can hold multiple values, but one at a time.
- compatible(other)[source]
Returns True if self and other are compatible, False otherwise.
As there is no semantic check, two VariantSpecs are compatible if either they contain the same value or they are both multi-valued.
- Parameters:
other – instance against which we test compatibility
- Returns:
True or False
- Return type:
- constrain(other)[source]
Modify self to match all the constraints for other if both instances are multi-valued. Returns True if self changed, False otherwise.
- Parameters:
other – instance against which we constrain self
- Returns:
True or False
- Return type:
- intersects(other)[source]
Returns True if there is a variant matching both self and other, False otherwise.
- exception spack.variant.UnknownVariantError(spec, variants)[source]
Bases:
SpecError
Raised when an unknown variant occurs in a spec.
- exception spack.variant.UnsatisfiableVariantSpecError(provided, required)[source]
Bases:
UnsatisfiableSpecError
Raised when a spec variant conflicts with package constraints.
- class spack.variant.Value(value, when)[source]
Bases:
object
Conditional value that might be used in variants.
- class spack.variant.Variant(name, default, description, values=(True, False), multi=False, validator=None, sticky=False)[source]
Bases:
object
Represents a variant in a package, as declared in the variant directive.
- property allowed_values
Returns a string representation of the allowed values for printing purposes
- Returns:
representation of the allowed values
- Return type:
- make_default()[source]
Factory that creates a variant holding the default value.
- Returns:
instance of the proper variant
- Return type:
MultiValuedVariant or SingleValuedVariant or BoolValuedVariant
- make_variant(value)[source]
Factory that creates a variant holding the value passed as a parameter.
- Parameters:
value – value that will be held by the variant
- Returns:
instance of the proper variant
- Return type:
MultiValuedVariant or SingleValuedVariant or BoolValuedVariant
- validate_or_raise(vspec, pkg_cls=None)[source]
Validate a variant spec against this package variant. Raises an exception if any error is found.
- Parameters:
vspec (Variant) – instance to be validated
pkg_cls (spack.package_base.PackageBase) – the package class that required the validation, if available
- Raises:
InconsistentValidationError – if vspec.name != self.name
MultipleValuesInExclusiveVariantError – if vspec has multiple values but self.multi == False
InvalidVariantValueError – if vspec.value contains invalid values
- property variant_cls
Proper variant class to be used for this configuration.
- class spack.variant.VariantMap(spec)[source]
Bases:
HashableMap
Map containing variant instances. New values can be added only if the key is not already present.
- property concrete
Returns True if the spec is concrete in terms of variants.
- Returns:
True or False
- Return type:
- constrain(other)[source]
Add all variants in other that aren’t in self to self. Also constrain all multi-valued variants that are already present. Return True if self changed, False otherwise
- Parameters:
other (VariantMap) – instance against which we constrain self
- Returns:
True or False
- Return type:
- copy()[source]
Return an instance of VariantMap equivalent to self.
- Returns:
a copy of self
- Return type:
- dict
- spack.variant.any_combination_of(*values)[source]
Multi-valued variant that allows any combination of the specified values, and also allows the user to specify ‘none’ (as a string) to choose none of them.
It is up to the package implementation to handle the value ‘none’ specially, if at all.
- Parameters:
*values – allowed variant values
- Returns:
a properly initialized instance of DisjointSetsOfValues
- spack.variant.auto_or_any_combination_of(*values)[source]
Multi-valued variant that allows any combination of a set of values (but not the empty set) or ‘auto’.
- Parameters:
*values – allowed variant values
- Returns:
a properly initialized instance of DisjointSetsOfValues
- spack.variant.conditional(*values, **kwargs)[source]
Conditional values that can be used in variant declarations.
- spack.variant.disjoint_sets(*sets)[source]
Multi-valued variant that allows any combination picking from one of multiple disjoint sets of values, and also allows the user to specify ‘none’ (as a string) to choose none of them.
It is up to the package implementation to handle the value ‘none’ specially, if at all.
- Parameters:
*sets – mutually exclusive sets of values
- Returns:
a properly initialized instance of DisjointSetsOfValues
- spack.variant.implicit_variant_conversion(method)[source]
Converts other to type(self) and calls method(self, other)
- Parameters:
method – any predicate method that takes another variant as an argument
Returns: decorated method
- spack.variant.substitute_abstract_variants(spec)[source]
Uses the information in spec.package to turn any variant that needs it into a SingleValuedVariant.
This method is best effort. All variants that can be substituted will be substituted before any error is raised.
- Parameters:
spec – spec on which to operate the substitution
spack.verify module
spack.version module
This module implements Version and version-ish objects. These are:
StandardVersion: A single version of a package. ClosedOpenRange: A range of versions of a package. VersionList: An ordered list of Version and VersionRange elements.
The set of Version and ClosedOpenRange is totally ordered with < defined as Version(x) < VersionRange(Version(y), Version(x)) if Version(x) <= Version(y).
- class spack.version.ClosedOpenRange(lo: StandardVersion, hi: StandardVersion)[source]
Bases:
object
- classmethod from_version_range(lo: StandardVersion, hi: StandardVersion)[source]
Construct ClosedOpenRange from lo:hi range.
- intersection(other: ClosedOpenRange | ConcreteVersion)[source]
- intersects(other: ConcreteVersion | ClosedOpenRange | VersionList)[source]
- overlaps(other: ClosedOpenRange | ConcreteVersion | VersionList) bool [source]
- satisfies(other: ClosedOpenRange | ConcreteVersion | VersionList)[source]
- union(other: ClosedOpenRange | ConcreteVersion | VersionList)[source]
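The closed-open interval logic behind intersects() and intersection() can be sketched over tuples standing in for versions (a simplification; the real class compares StandardVersion objects):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Range:
    lo: tuple  # inclusive lower bound
    hi: tuple  # exclusive upper bound

    def intersects(self, other: "Range") -> bool:
        # [lo1, hi1) and [lo2, hi2) overlap iff each starts before the other ends
        return self.lo < other.hi and other.lo < self.hi

    def intersection(self, other: "Range"):
        if not self.intersects(other):
            return None
        return Range(max(self.lo, other.lo), min(self.hi, other.hi))

a = Range((1, 0), (2, 0))  # versions 1.0 (inclusive) up to 2.0 (exclusive)
b = Range((1, 5), (3, 0))
c = a.intersection(b)      # Range(lo=(1, 5), hi=(2, 0))
```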
- class spack.version.CommitLookup(pkg_name)[source]
Bases:
object
An object for cached lookups of git commits
CommitLookup objects delegate to the misc_cache for locking. CommitLookup objects may be attached to a GitVersion to allow for comparisons between git refs and versions as represented by tags in the git repository.
- property cache_key
- property cache_path
- property fetcher
- lookup_ref(ref) Tuple[str | None, int] [source]
Lookup the previous version and distance for a given commit.
We use git to compare the known versions from package to the git tags, as well as any git tags that are SEMVER versions, and find the latest known version prior to the commit, as well as the distance from that version to the commit in the git repo. Those values are used to compare Version objects.
- property pkg
- property repository_uri
Identifier for git repos used within the repo and metadata caches.
- class spack.version.GitVersion(string: str)[source]
Bases:
ConcreteVersion
Class to represent versions interpreted from git refs.
There are two distinct categories of git versions:
GitVersions instantiated with an associated reference version (e.g. ‘git.foo=1.2’)
GitVersions requiring commit lookups
Git ref versions that are not paired with a known version are handled separately from all other version comparisons. When Spack identifies a git ref version, it associates a CommitLookup object with the version. This object handles caching of information from the git repo. When executing comparisons with a git ref version, Spack queries the CommitLookup for the most recent version previous to this git ref, as well as the distance between them expressed as a number of commits. If the previous version is X.Y.Z and the distance is D, the git commit version is represented by the tuple (X, Y, Z, '', D). The component '' cannot be parsed as part of any valid version, but is a valid component. This allows a git ref version to be less than (older than) every Version newer than its previous version, but still newer than its previous version.
To find the previous version from a git ref version, Spack queries the git repo for its tags. Any tag that matches a version known to Spack is associated with that version, as is any tag that is a known version prepended with the character v (i.e., a tag v1.0 is associated with the known version 1.0). Additionally, any tag that represents a semver version (X.Y.Z with X, Y, Z all integers) is associated with the version it represents, even if that version is not known to Spack. Each tag is then queried in git to see whether it is an ancestor of the git ref in question, and if so the distance between the two. The previous version is the version that is an ancestor with the least distance from the git ref in question.
This procedure can be circumvented if the user supplies a known version to associate with the GitVersion (e.g. [hash]=develop). If the user prescribes the version then there is no need to do a lookup and the standard version comparison operations are sufficient.
- attach_git_lookup_from_package(pkg_name)[source]
Use the git fetcher to look up a version for a commit.
Since we want to optimize the clone and lookup, we do the clone once and store it in the user specified git repository cache. We also need context of the package to get known versions, which could be tags if they are linked to Git Releases. If we are unable to determine the context of the version, we cannot continue. This implementation is alongside the GitFetcher because eventually the git repos cache will be one and the same with the source cache.
- property dashed: StandardVersion
- property dotted: StandardVersion
- has_git_prefix
- property joined: StandardVersion
- ref
- property ref_lookup
- property ref_version: StandardVersion
- satisfies(other: GitVersion | StandardVersion | ClosedOpenRange | VersionList)[source]
- property underscored: StandardVersion
- up_to(index) StandardVersion [source]
- class spack.version.StandardVersion(string: str | None, version: tuple, separators: tuple)[source]
Bases:
ConcreteVersion
Class to represent versions
- property dashed
The dashed representation of the version.
Example:
>>> version = Version('1.2.3b')
>>> version.dashed
Version('1-2-3b')
- Returns:
The version with separator characters replaced by dashes
- Return type:
- property dotted
The dotted representation of the version.
Example:
>>> version = Version('1-2-3b')
>>> version.dotted
Version('1.2.3b')
- Returns:
The version with separator characters replaced by dots
- Return type:
- intersection(other: ClosedOpenRange | StandardVersion)[source]
- intersects(other: StandardVersion | GitVersion | ClosedOpenRange) bool [source]
- property joined
The joined representation of the version.
Example:
>>> version = Version('1.2.3b')
>>> version.joined
Version('123b')
- Returns:
The version with separator characters removed
- Return type:
- satisfies(other: ClosedOpenRange | StandardVersion | GitVersion | VersionList) bool [source]
- separators
- string
- property underscored
The underscored representation of the version.
Example:
>>> version = Version('1.2.3b')
>>> version.underscored
Version('1_2_3b')
- Returns:
The version with separator characters replaced by underscores
- Return type:
- union(other: ClosedOpenRange | StandardVersion)[source]
- up_to(index)[source]
The version up to the specified component.
Examples:
>>> version = Version('1.23-4b')
>>> version.up_to(1)
Version('1')
>>> version.up_to(2)
Version('1.23')
>>> version.up_to(3)
Version('1.23-4')
>>> version.up_to(4)
Version('1.23-4b')
>>> version.up_to(-1)
Version('1.23-4')
>>> version.up_to(-2)
Version('1.23')
>>> version.up_to(-3)
Version('1')
- Returns:
The first index components of the version
- Return type:
- version
- spack.version.Version(string: str | int) GitVersion | StandardVersion [source]
- exception spack.version.VersionChecksumError(message, long_message=None)[source]
Bases:
VersionError
Raised for version checksum errors.
- exception spack.version.VersionError(message, long_message=None)[source]
Bases:
SpackError
This is raised when something is wrong with a version.
- class spack.version.VersionList(vlist=None)[source]
Bases:
object
Sorted, non-redundant list of Version and ClosedOpenRange elements.
- property concrete: ConcreteVersion | None
- property concrete_range_as_version: ConcreteVersion | None
Like concrete, but collapses VersionRange(x, x) to Version(x). This is just for compatibility with old Spack.
- highest() StandardVersion | None [source]
Get the highest version in the list.
- highest_numeric() StandardVersion | None [source]
Get the highest numeric version in the list.
- intersect(other) bool [source]
Intersect this spec’s list with other.
Return True if the spec changed as a result; False otherwise
- intersection(other: VersionList) VersionList [source]
- lowest() StandardVersion | None [source]
Get the lowest version in the list.
- preferred() StandardVersion | None [source]
Get the preferred (latest) version in the list.
- union(other: VersionList)[source]
- update(other: VersionList)[source]
- exception spack.version.VersionLookupError(message, long_message=None)[source]
Bases:
VersionError
Raised for errors looking up git commits as versions.
- spack.version.VersionRange(lo: str | StandardVersion, hi: str | StandardVersion)[source]
- spack.version.any_version: VersionList = [:]
This version list contains all possible versions.
- spack.version.from_string(string) VersionList | ClosedOpenRange | StandardVersion | GitVersion [source]
Converts a string to a version object. This is private. Client code should use ver().
- spack.version.next_version(v: StandardVersion) StandardVersion [source]
- spack.version.next_version_str_component(v: VersionStrComponent) VersionStrComponent [source]
Produce the next VersionStrComponent, e.g. masteq -> mastes, master -> main.
- spack.version.prev_version(v: StandardVersion) StandardVersion [source]
- spack.version.prev_version_str_component(v: VersionStrComponent) VersionStrComponent [source]
Produce the previous VersionStrComponent, e.g. mastes -> masteq, master -> head.
- spack.version.ver(obj) VersionList | ClosedOpenRange | StandardVersion | GitVersion [source]
Parses a Version, VersionRange, or VersionList from a string or list of strings.