News

Posted about 11 years ago by Marcus Hanwell
The first workshop on "Sustainable Software for Science: Practice and Experience" was held at the Supercomputing Conference in Denver, CO on November 17, 2013. This meeting was organized by the Software Sustainability Institute at the University of Edinburgh and the National Science Foundation to examine how we can create sustainable software platforms that can best serve the needs of scientific research. Prior to the workshop, those who planned to attend were asked to contribute articles, distributed under an open license, to be submitted to a service that issues a digital object identifier (DOI), such as arXiv or Figshare. We were given a strict limit of four pages and a deadline for submissions. At Kitware, we put together an article on "Sustainable Software Ecosystems for Open Science." In it, we (briefly) outlined fifteen years of practice and experience at Kitware in developing open source software for science. The final and complete list of accepted submissions is available here. Each article was reviewed after publication, but before the workshop, and categorized according to the proposed panels at the workshop. Meeting notes were taken as a collaborative effort throughout the workshop, including links to the two keynotes. Philip E. Bourne talked about "A Recipe for Sustainable Software," and Arfon Smith, recently recruited by GitHub, spoke about "Scientific Software and the Open Collaborative Web." Both keynotes recognized the importance of software for science and the lack of recognition it receives in the current academic climate. Once the keynotes concluded, the panel discussions began, with the contributed papers assigned to each beforehand. It was great to see such a broad variety of paper topics. I think that scientific research has come to rely increasingly on software in the last few decades, with little to no recognition for the development of software tools.
As noted by the keynotes, this needs to change, and funding has to be earmarked to develop important tools. The best way to foster reuse and extension is through the use of permissive open source licenses, where shared software platforms can be developed collaboratively across organizational barriers. A paper was published summarizing all accepted submissions, and more complete papers will hopefully be authored going forward. Ideally, sustainable positions within academia need to be established, along with viable business models for software companies innovating in areas such as open source software services for scientific software. It was especially appropriate to see this first meeting hosted by the Supercomputing Conference, where open source has played such a critical role, in large part due to licensing issues on supercomputers where per-core fees are charged, and the availability of source code enabling researchers to rapidly adapt software for deployment on exotic architectures. I look forward to further exploring how we can foster sustainable software for science, and to adapting reward and funding mechanisms to encourage desirable outcomes. I think greater recognition of open source from both funding agencies and universities is critical here.
This article was originally published on opensource.com under a CC-BY license.
Posted about 11 years ago by Matt McCormick
The 4.5 release is a major milestone that marks the hard work of many outstanding community members. Among the major contributions in this release are registrations computed with 32-bit floats, a new edition of the Software Guide, the first release of the wiki examples tarball, and the first release of the Sphinx examples tarballs. More details can be found in the release announcement. Links to the tarballs are on the download page. Enjoy ITK!
Posted about 11 years ago by Robert Maynard
Some serious problems were reported with the 2.8.12 release. Thanks to the swift work of Brad King, Stephen Kelly, Modestas Vainius, and Vladislav Vinogradov, those problems have been fixed. We've prepared a 2.8.12.1 bug fix release to address those issues. Some of the notable changes in this patch release are:
- The implementation of the new CMake Policy CMP0022 was corrected to preserve default transitive linking by the target_link_libraries plain signature when the policy is set to NEW. As a result, CMake 2.8.12.1 may produce CMP0022 warnings that 2.8.12.0 did not.
- The Makefile and Ninja generators, when using MSVC from Visual Studio 2013, now pass the "/FS" option to "cl" to support parallel builds.
- FindCUDA learned of CUDA 5.5's divided NPP libraries.
The complete list of changes in 2.8.12.1 can be found at: http://www.cmake.org/Wiki/CMake/ChangeLog
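As a sketch of the behavior the CMP0022 fix restores (target and file names here are hypothetical), the plain target_link_libraries signature should still propagate link dependencies transitively even when the policy is set to NEW:

```cmake
cmake_minimum_required(VERSION 2.8.12)
project(transitive CXX)
cmake_policy(SET CMP0022 NEW)

add_library(core core.cxx)
add_library(util util.cxx)

# Plain signature: under the corrected CMP0022 NEW behavior, 'core' remains
# part of util's link interface, so consumers of 'util' also link 'core'.
target_link_libraries(util core)

add_executable(app app.cxx)
target_link_libraries(app util)   # transitively links 'core' as well
```

With the regression in 2.8.12.0, the transitive link to 'core' could be lost under the NEW policy; 2.8.12.1 corrects this, which is also why it may emit CMP0022 warnings that the previous release did not.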
Posted over 11 years ago by Robert Maynard
On behalf of myself, Ken, Bill, Brad, David, Alex, Eike, Steve, Eric, Zach, Ben and the rest of the CMake team from all around the world, we are pleased to announce that CMake 2.8.12 is now available for download at: http://www.cmake.org/files/v2.8/?C=M;O=D
It is also available from the usual download links found on the CMake web site: http://www.cmake.org/cmake/resources/software.html
We would like to thank the CMake users who reported issues during the CMake 2.8.12 release cycle. Your contributions are greatly appreciated. Some of the notable changes in this release are:
- Introduced the target_compile_options command, which specifies compile options to use when compiling a given target. It supports PUBLIC, PRIVATE, and INTERFACE options: PRIVATE and PUBLIC items populate the COMPILE_OPTIONS property of the target, while PUBLIC and INTERFACE items populate the INTERFACE_COMPILE_OPTIONS property. Generator expressions are supported.
- Introduced the add_compile_options command, which adds options to the compiler command line for sources in the current directory and below. Generator expressions are supported.
- Introduced CMake Policy CMP0021: it is now an error to add relative paths to the INCLUDE_DIRECTORIES target property.
- Introduced CMake Policy CMP0022: target properties matching (IMPORTED_)LINK_INTERFACE_LIBRARIES(_<CONFIG>) are ignored and will no longer be populated by the target_link_libraries command. It is now an error to populate these properties directly in user code. Instead, use the INTERFACE keyword with target_link_libraries, or the INTERFACE_LINK_LIBRARIES target property.
- Introduced CMake Policy CMP0023: plain and keyword target_link_libraries signatures cannot be mixed for a given target when this policy is enabled. Once the PUBLIC, PRIVATE, or INTERFACE keywords are used, all subsequent target_link_libraries calls on the target must use one of these keywords.
- Introduced support for RPATH under OS X. Please see the blog post by Clinton Stimpson about using RPATH on OS X (http://www.kitware.com/blog/home/post/510).
- CMake: New PUBLIC, PRIVATE, and INTERFACE options for target_link_libraries
- CMake: New ALIAS targets feature
- CMake: Automatically process the Headers directory of Apple Frameworks as a usage requirement
- CMake: The file command now supports the GENERATE signature to produce files at generate time
- CMake: target_include_directories now supports the SYSTEM parameter
- CMake: Added support for Java in cross-compilation toolchains
- CMake: Improved support for the IAR toolchain
- CMake: Improved support for the ARM toolchain under Visual Studio
- CMake: Improvements to the Visual Studio generators, including separate compiler and linker PDB files, support for subdirectory MSBuild projects, support for assembly code in VS10, and support for Windows CE in VS11
- CMake: Added the COMPILE_OPTIONS target property
- CMake: Added INTERFACE_LINK_LIBRARIES as a target property
- CMake: The tar command now supports .zip files
- CMake: try_compile now supports multiple source files
- CMake: Optimized custom command dependency lookup
- CMake: Removal of configured files will retrigger CMake when issuing a build command
- CMake: Ninja now tracks custom command generated files that aren't listed as output
- CMake: Added generator expression support for compiler versions
- CMake: Added support for Xcode 5.0
- CMake-GUI: Added search functions for the Output window
- CTest: Improved memory checker support
- FindGTK2: General improvements
- FindCUDA: Multiple improvements to the custom commands
The bug tracker change log page for this version is at: http://public.kitware.com/Bug/changelog_page.php?version_id=112
The complete list of changes for this release can be found at: http://www.cmake.org/Wiki/CMake/ChangeLog
Thanks
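As a sketch of how several of the new 2.8.12 features fit together (target and source names here are hypothetical), a CMakeLists.txt using target_compile_options, an ALIAS target, and the keyword signature of target_link_libraries might look like:

```cmake
cmake_minimum_required(VERSION 2.8.12)
project(example CXX)

add_library(mylib SHARED mylib.cxx)

# PRIVATE options affect only mylib; INTERFACE options propagate to consumers.
# The second item is a generator expression, evaluated per-compiler.
target_compile_options(mylib
  PRIVATE -Wall
  INTERFACE $<$<CXX_COMPILER_ID:GNU>:-Wno-deprecated>)

# ALIAS targets give consumers a namespaced name to link against.
add_library(Example::mylib ALIAS mylib)

add_executable(app app.cxx)
# Keyword signature: once PUBLIC/PRIVATE/INTERFACE is used for a target,
# all later target_link_libraries calls on it must use keywords (CMP0023).
target_link_libraries(app PRIVATE Example::mylib)
```

Here PRIVATE items land in mylib's COMPILE_OPTIONS property, while the INTERFACE item lands in INTERFACE_COMPILE_OPTIONS and is picked up by app as a usage requirement.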
Posted over 11 years ago by Aashish Chaudhary
VES (http://ves.vtk.org) is the VTK for Embedded Systems: a C++ rendering library for mobile devices using OpenGL ES 2.0. VES integrates with the Visualization Toolkit (VTK) to deliver scientific and medical visualization capabilities, remote rendering and/or data hosting, and modern, collaborative interaction techniques.
Recently, we made improvements to building VES (specifically KiwiViewer) for Android on Linux and Mac. The build procedure requires the Android NDK and Android SDK to be installed on the system. The steps to build VES and KiwiViewer for Android on Linux and Mac are as follows:
1. Check out the code from the VES staging repository:
   > git clone git://vtk.org/stage/VES.git
   > git checkout improve_ves_android
2. Download the Android NDK. Android NDK r8b is currently the required version. It can be downloaded from:
   http://dl.google.com/android/ndk/android-ndk-r8b-linux-x86.tar.bz2
   http://dl.google.com/android/ndk/android-ndk-r8b-darwin-x86.tar.bz2
3. Download and install the Android SDK. Please refer to the documentation at: http://developer.android.com/sdk/installing/index.html
4. Download the Android SDK ADT plugin for Eclipse. More information can be found here: http://developer.android.com/sdk/installing/installing-adt.html
5. Open the Android SDK Manager and install:
   Tools > Android SDK Tools
   Tools > Android SDK Platform-tools
   Tools > Android SDK Build-tools
   Android 4.3 (API 18) > SDK Platform
   Android 4.3 (API 18) > ARM EABI v7a System Image
6. Build VES:
   export ANDROID_NDK="path/to/ndk/root"
   mkdir build
   cd build
   cmake -DVES_ANDROID_SUPERBUILD:BOOL=TRUE "$ves_src/Apps/Android/CMakeBuild"
   build (make -j8, for instance)
7. Build KiwiViewer:
   cmake \
     -G "Eclipse CDT4 - Unix Makefiles" \
     -DCMAKE_TOOLCHAIN_FILE="$ves_src/CMake/toolchains/android.toolchain.cmake" \
     -DANDROID_NATIVE_API_LEVEL=8 \
     -DANDROID_TARGET=android-18 \
     -DANDROID_EXECUTABLE="path/to/android" \
     -DVTK_DIR="path/to/cross/build/CMakeExternals/Build/vtk-android" \
     -DVES_DIR="path/to/cross/build/CMakeExternals/Build/ves-android" \
     "$ves_src/Apps/Android/Kiwi"
   build (make -j8, for instance)
8. Create a new project in Eclipse.
9. Import it in Eclipse.
10. Hit Finish. The project should look as shown below.
11. Run it as you would any Android application.
That's it! We hope to make this work on Windows as well. Note that if you have a different version of the Android NDK or SDK, the above procedure may not work. We are in the process of making this work available in VES master, and we will have fixes for Windows in the near future.
Happy building!
--------
VES is an open source project. You can post your questions to [email protected]. For commercial and technical support, please contact us at [email protected].
Posted over 11 years ago by Robert Maynard
The CMake 2.8.12 release candidate stream continues! Thanks to numerous CMake users, we uncovered a couple of critical regressions in RC3. We expect that RC4 will be the final RC unless a new critical, must-fix regression is discovered. You can find the source and binaries here: http://www.cmake.org/files/v2.8/?C=M;O=D
Some of the notable changes in this release are:
- Introduced the target_compile_options command, which specifies compile options to use when compiling a given target. It supports PUBLIC, PRIVATE, and INTERFACE options: PRIVATE and PUBLIC items populate the COMPILE_OPTIONS property of the target, while PUBLIC and INTERFACE items populate the INTERFACE_COMPILE_OPTIONS property. Generator expressions are supported.
- Introduced the add_compile_options command, which adds options to the compiler command line for sources in the current directory and below. Generator expressions are supported.
- Introduced CMake Policy CMP0021: it is now an error to add relative paths to the INCLUDE_DIRECTORIES target property.
- Introduced CMake Policy CMP0022: target properties matching (IMPORTED_)LINK_INTERFACE_LIBRARIES(_<CONFIG>) are ignored and will no longer be populated by the target_link_libraries command. It is now an error to populate these properties directly in user code. Instead, use the INTERFACE keyword with target_link_libraries, or the INTERFACE_LINK_LIBRARIES target property.
- Introduced CMake Policy CMP0023: plain and keyword target_link_libraries signatures cannot be mixed for a given target when this policy is enabled. Once the PUBLIC, PRIVATE, or INTERFACE keywords are used, all subsequent target_link_libraries calls on the target must use one of these keywords.
- Introduced support for RPATH under OS X. Please see the blog post by Clinton Stimpson about using RPATH on OS X (http://www.kitware.com/blog/home/post/510).
- CMake: New PUBLIC, PRIVATE, and INTERFACE options for target_link_libraries
- CMake: New ALIAS targets feature
- CMake: Automatically process the Headers directory of Apple Frameworks as a usage requirement
- CMake: The file command now supports the GENERATE signature to produce files at generate time
- CMake: target_include_directories now supports the SYSTEM parameter
- CMake: Added support for Java in cross-compilation toolchains
- CMake: Improved support for the IAR toolchain
- CMake: Improved support for the ARM toolchain under Visual Studio
- CMake: Improvements to the Visual Studio generators, including separate compiler and linker PDB files, support for subdirectory MSBuild projects, support for assembly code in VS10, and support for Windows CE in VS11
- CMake: Added the COMPILE_OPTIONS target property
- CMake: Added INTERFACE_LINK_LIBRARIES as a target property
- CMake: The tar command now supports .zip files
- CMake: try_compile now supports multiple source files
- CMake: Optimized custom command dependency lookup
- CMake: Removal of configured files will retrigger CMake when issuing a build command
- CMake: Ninja now tracks custom command generated files that aren't listed as output
- CMake: Added generator expression support for compiler versions
- CMake-GUI: Added search functions for the Output window
- CTest: Improved memory checker support
- FindGTK2: General improvements
- FindCUDA: Multiple improvements to the custom commands
The bug tracker change log page for this version is at: http://public.kitware.com/Bug/changelog_page.php?version_id=112
The complete list of changes in RC4 since RC3 can be found at: http://www.cmake.org/Wiki/CMake/ChangeLog
As this is expected to be the last RC release, please test it and report any regressions to the list or the bug tracker. Thanks
Posted over 11 years ago by Matt McCormick
For effective software development, the edit-compile-test cycle should run quickly. To get a better understanding of ITK's test times, we created a Python script that visualizes CTest test times with VTK's TreeMapView.
ITK test times. Tests that run longer have a proportionally larger area.
With the visualization, it is clear that some tests take significantly longer than most others. It is possible to zoom in and out on tests and identify their names with a mouse hover or click. After identifying the long-running tests, we added a CTest label to them.
ITK tests without the RUNS_LONG label (top) and with the RUNS_LONG label (bottom).
Tests with the RUNS_LONG label constituted less than 1% of the tests, but they take approximately half of the time to run the test suite! It is also possible to organize the visualization by ITK module label.
ITK test times organized by module.
Most of the long-running tests identified were integration tests, as opposed to unit tests, and tests for advanced algorithms on semi-realistic data. The decrease in code coverage after excluding the tests was 0.25%. To speed up the edit-build-test cycle, it is possible to exclude the long-running tests locally with
  ctest -LE RUNS_LONG
For dashboard build platforms that take so long to run that their maintainers may prefer not to run them at all (we're looking at you, Visual Studio in Debug!), the tests can be excluded with
  set(CTEST_TEST_ARGS EXCLUDE_LABEL RUNS_LONG)
Of course, these approaches should only be taken after it is determined that it is not feasible to refactor the code or the tests so they run more quickly. The script is generally applicable on any system that has VTK Python wrapping installed and uses CTest. For example, screen captures of the VTK and CMake test suites are shown below.
VTK test times.
CMake test times.
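As a sketch of how a label like this is attached (the test names and driver commands here are hypothetical), a project's CMakeLists.txt sets the LABELS property on its slow tests:

```cmake
# Register tests as usual.
add_test(NAME itkFastUnitTest COMMAND fastTestDriver)
add_test(NAME itkSlowIntegrationTest COMMAND slowTestDriver)

# Tag the slow test so it can be filtered by label.
set_tests_properties(itkSlowIntegrationTest PROPERTIES LABELS "RUNS_LONG")

# Developers can then filter by label when running CTest:
#   ctest -LE RUNS_LONG    # exclude tests whose labels match RUNS_LONG
#   ctest -L  RUNS_LONG    # run only the labeled tests
```

Because labels are ordinary test properties, no source changes are needed; the same label can drive both local filtering and dashboard configuration.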
Posted over 11 years ago by Xiaoxiao Liu
1.Background and Introduction The Insight Toolkit (ITK) was officially modularized in version 4.0.0, released in Dec. 2011 [1,2,7]. Developers have been testing and improving the modular structure since then. At present, the toolkit contains 137 ... [More] regular/internal modules [5] and four Remote modules [3]. Here we will provide step-by-step instructions for ITK community members to take advantage of the modular structure when building a customized ITK library or using it as a third party library in ITK applications. Finally, we will introduce the External and Remote module support, which makes extending the toolkit easier. *The following instructions apply to ITK 4.5.0 and later. 2. Build ITK via CMake GUI The most common way to build ITK is via the CMake GUI. There are three main approaches to selecting the modules for building the library: the default mode, the group mode, and the advanced module mode. 2.1 Default Mode By default, most internal ITK modules, except those that depend on external third party libraries (such as ITKVtkGlue, ITKBridgeOpenCV, ITKBridgeVXL, etc. ) and those with legacy code (ITKReview, ITKDeprecated and ITKv3Compatibility), will be built into the ITK library. All non-default modules, which includes the remote modules, have "EXCLUDE_FROM_DEFAULT" tags in their module definition files (itk-module.cmake). ITK_BUILD_DEFAULT_MODULES is the CMake option to request that all of these default modules be built, and by default this option is ON (see Figure 1). Figure 1. CMake GUI for configuring ITK in the default mode: after the first Configure In the advanced mode of the CMake GUI, you can manually toggle the non-default modules via the Module_{module name} options. All default modules' Module_{module name} options are not available for toggle since they are already requested and enabled via ITK_BUILD_DEFAULT_MODULES=ON (see Figure 2). Figure 2. 
CMake GUI for configuring ITK in the default mode: after the second Configure, in the advance mode it shows options for non-default ITK Modules When ITK_BUILD_DEFAULT_MODULES is toggled to be OFF, users can then customize the list of default modules to be included in the ITK library, using the the Group (section 2.2) and Module (section 2.3) options. 2.2 Group Mode The CMake variables ITKGroup_{group name} are visible to toggle when ITK_BUILD_DEFAULT_MODULES=OFF. ITK is organized so that groups of modules that have close relationships or similar functionalities reside in the same directory. Presently, there are 11 groups (not including the External and Remote groups). The CMake ITKGroup_{group name} option is created for the convenience of enabling or disabling multiple modules at once. The Core group is turned on by default (see Figure 3). Figure 3. CMake GUI for configuring ITK in the Group Mode When a group is ON, all the modules in that group and their (recursively) module dependencies are enabled. When a group is turned OFF, the modules in the group, except the ones that are required by other enabled modules, default to be OFF. 2.3 Advanced Module Mode If you are not sure about which groups to turn on, but you do have a list of specific modules to be included in your ITK library, you can certainly skip the Group options and use the Module_{module name} options only. For whichever modules are selected, their (recursively) dependent modules are automatically enabled. However, not all the modules will show up in the CMake GUI for toggling, due to the various levels of controls in the previous two modes. If they are already enabled by other options or depended upon by other modules, they will be hidden from the GUI. 
For example, Module_{ITKFoo} option is hidden for toggle when the module ITKFoo itself is enabled either by 1) module dependencies: ITKBar depends on ITKFoo, or by 2) the ITKGroup_{FooAndBar} option: ITKFoo belongs to the group, or by 3) ITK_BUILD_DEFAULT_MODULES=ON and ITKFoo is a default module. To find out why a particular module is enabled, check the CMake configuration messages where the module enablement information is displayed (Figure 1). The enablement messages are sorted in alphabetical order by module names. 3. Build ITK via Command Line For those who prefer to build ITK via the command line syntax, here are some usage examples: Example 1: All default modules The default is ITK_BUILD_DEFULT_MODULES=ON, so it is optional to have this -D argument. Example 2: Enable specific group(s) of modules BUILD_EXAMPLES=ON requires ITK_BUILD_DEFAULT_MODULES=ON, therefore it is necessary to add -DBUILD_EXAMPLES=OFF to the command line. Example 3: Enable specific modules 4. ITK Applications and Superbuild For a project that uses ITK as an external library, it is recommended to specify the individual desired ITK modules in the COMPONENTS argument of the find_package CMake command: If you would like to use the CMake ExternalProject module to download ITK source code when building your ITK application (a.k.a. superbuild ITK), here is a basic CMake snippet for setting up a superbuild in an ITKv4 application project using CMake 2.8.7+: More exemple configurations for superbuild ITK projects can be found in Slicer, BrainsTools, ITK Wiki Examples, and ITK Sphinx Examples. 5. Extend ITK with External and Remote modules The modularization effort has significantly improved the extensibility of the toolkit and lowered the barrier for contributions. An External module [4] is distributed outside the ITK main repository, but it could be built into ITK as a module once downloaded into the local copy of ITK source tree. 
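The concrete command lines for section 3 and the find_package usage for section 4 appeared as code blocks that did not survive republication; as a sketch of what they plausibly look like (paths and module names here are hypothetical, and the application snippet assumes a project that needs only common and image-IO modules):

```cmake
# Section 4 sketch: consuming ITK from an application's CMakeLists.txt,
# requesting only the needed modules via COMPONENTS.
find_package(ITK REQUIRED COMPONENTS ITKCommon ITKIOImageBase ITKIOPNG)
include(${ITK_USE_FILE})

add_executable(MyApp main.cxx)
target_link_libraries(MyApp ${ITK_LIBRARIES})

# Section 3 sketch: configuring an ITK build from the command line.
#   Example 1 - all default modules (the -D argument is optional):
#     cmake -DITK_BUILD_DEFAULT_MODULES=ON ../ITK
#   Example 2 - enable specific group(s) of modules:
#     cmake -DITK_BUILD_DEFAULT_MODULES=OFF -DITKGroup_Core=ON \
#           -DITKGroup_Filtering=ON -DBUILD_EXAMPLES=OFF ../ITK
#   Example 3 - enable specific modules:
#     cmake -DITK_BUILD_DEFAULT_MODULES=OFF -DModule_ITKCommon=ON ../ITK
```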
When developing a new ITK module, place it into the Modules/External directory in a local ITK source tree to include it in your ITK library. The next step in the module contribution process is to submit the External module to the Insight Journal for review. Once the External module passes dashboard testing and peer review, it can be made into a Remote module [3], which is disseminated from ITK itself while allowing its source code to be distributed independently of the main ITK repository. To be more specific, the source code of a Remote module can be downloaded by CMake (with a CMake variable switch) at ITK CMake configuration time, making it a convenient way to distribute modular source code without increasing the size of the main repository. After the Remote module has experienced sufficient testing, and community members express broad interest in the contribution, the submitter can then move the contribution into the ITK repository via Gerrit code review. It is possible, but not recommended, to submit a module directly to Gerrit for review without submitting to the Insight Journal first. There are no established rules on whether a module should be directly merged into the main repository, but a nice IJ article describing the functionality and design behind the module is a great way to provide extra documentation on the code and could serve as good ITK Software Guide material in the future. Figure 4 illustrates the module contribution channels.
Figure 4. Contribute a module to ITK.

6. Summary and Discussion
For most ITK consumers/users, the modularization of the toolkit is relatively transparent: the default configuration includes all the (default) modules in the ITK library, which is used to build their own ITK applications. For ITK developers and more advanced users, the modular structure has changed the rules for organizing the source code, building the library, and contributing to the ITK community. The Remote module infrastructure enables fast dissemination of research code through ITK without increasing the size of the main repository. The Insight Journal has also added support for ITK module submissions with automatic dashboard testing. The VTK community has followed the modularization progress in ITK closely, and VTK has been equipped with the same internal modular architecture with slightly different configuration options [6]. Adapting to the new modular architecture takes time. Hopefully this blog provides some useful guidance to assist our community members with this transition. Please let us know if you find problems with the instructions, or if you have questions or suggestions. Your comments are always welcome.

References:
[1] Wiki page for ITK modularization: http://www.itk.org/Wiki/ITK/Release_4/Modularization
[2] How to organize your ITK code into an ITK module: http://insightsoftwareconsortium.github.io/ITKBarCamp-doc/ITK/ConstructITKModule/index.html
[3] ITK Remote modules: http://www.itk.org/Wiki/ITK/Policy_and_Procedures_for_Adding_Remote_Modules
[4] ITK External modules: http://www.itk.org/Wiki/ITK/Release_4/Modularization/Add_an_external_module_(external_module)
[5] Doxygen page for ITK modules: http://www.itk.org/Doxygen/html/modules.html
[6] VTK Modularization and Modernization: http://www.kitware.com/blog/home/post/496
[7] ITK modularization: Divide and Conquer: http://www.kitware.com/source/home/post/34
Posted over 11 years ago by Xiaoxiao Liu
1.Background and Introduction The Insight Toolkit (ITK) was officially modularized in version 4.0.0, released in Dec. 2011 [1,2,7]. Developers have been testing and improving the modular structure since then. At present, the toolkit contains 137 ... [More] regular/internal modules [5] and four Remote modules [3]. Here we will provide step-by-step instructions for ITK community members to take advantage of the modular structure when building a customized ITK library or using it as a third party library in ITK applications. Finally, we will introduce the External and Remote module support, which makes extending the toolkit easier. *The following instructions apply to ITK 4.5.0 and later. 2. Build ITK via CMake GUI The most common way to build ITK is via the CMake GUI. There are three main approaches to selecting the modules for building the library: the default mode, the group mode, and the advanced module mode. 2.1 Default Mode By default, most internal ITK modules, except those that depend on external third party libraries (such as ITKVtkGlue, ITKBridgeOpenCV, ITKBridgeVXL, etc. ) and those with legacy code (ITKReview, ITKDeprecated and ITKv3Compatibility), will be built into the ITK library. All non-default modules, which includes the remote modules, have "EXCLUDE_FROM_DEFAULT" tags in their module definition files (itk-module.cmake). ITK_BUILD_DEFAULT_MODULES is the CMake option to request that all of these default modules be built, and by default this option is ON (see Figure 1). Figure 1. CMake GUI for configuring ITK in the default mode: after the first Configure In the advanced mode of the CMake GUI, you can manually toggle the non-default modules via the Module_{module name} options. The Module_{module name} options of the default modules are not available for toggle since they are already requested and enabled via ITK_BUILD_DEFAULT_MODULES=ON (see Figure 2). Figure 2. 
CMake GUI for configuring ITK in the default mode: after the second Configure, in the advance mode it shows options for non-default ITK Modules When ITK_BUILD_DEFAULT_MODULES is toggled to be OFF, users can then customize the list of default modules to be included in the ITK library, using the the Group (section 2.2) and Module (section 2.3) options. 2.2 Group Mode The CMake variables ITKGroup_{group name} are visible to toggle when ITK_BUILD_DEFAULT_MODULES=OFF. ITK is organized so that groups of modules that have close relationships or similar functionalities reside in the same directory. Presently, there are 11 groups (not including the External and Remote groups). The CMake ITKGroup_{group name} option is created for the convenience of enabling or disabling multiple modules at once. The Core group is turned on by default (see Figure 3). Figure 3. CMake GUI for configuring ITK in the Group Mode When a group is ON, all the modules in that group and their (recursively) module dependencies are enabled. When a group is turned OFF, the modules in the group, except the ones that are required by other enabled modules, default to be OFF. 2.3 Advanced Module Mode If you are not sure about which groups to turn on, but you do have a list of specific modules to be included in your ITK library, you can certainly skip the Group options and use the Module_{module name} options only. For whichever modules are selected, their (recursively) dependent modules are automatically enabled. However, not all the modules will show up in the CMake GUI for toggling, due to the various levels of controls in the previous two modes. If they are already enabled by other options or other modules, they will be hidden from the GUI. 
For example, the Module_ITKFoo option is hidden when the module ITKFoo is already enabled by 1) a module dependency (ITKBar depends on ITKFoo), 2) the ITKGroup_FooAndBar option (ITKFoo belongs to that group), or 3) ITK_BUILD_DEFAULT_MODULES=ON when ITKFoo is a default module. To find out why a particular module is enabled, check the CMake configuration messages, where the module enablement information is displayed (Figure 1); the enablement messages are sorted alphabetically by module name.

3. Build ITK via Command Line

For those who prefer to build ITK from the command line, here are some usage examples:

Example 1: All default modules. The default is ITK_BUILD_DEFAULT_MODULES=ON, so this -D argument is optional.

Example 2: Enable specific group(s) of modules. Because BUILD_EXAMPLES=ON requires ITK_BUILD_DEFAULT_MODULES=ON, it is necessary to add -DBUILD_EXAMPLES=OFF to the command line.

Example 3: Enable specific modules.

4. ITK Applications and Superbuild

For a project that uses ITK as an external library, it is recommended to specify the individual desired ITK modules in the COMPONENTS argument of the find_package CMake command. If you would like to use the CMake ExternalProject module to download the ITK source code when building your ITK application (a.k.a. superbuild ITK), a basic CMake snippet can set up the superbuild in an ITKv4 application project using CMake 2.8.7+. More example configurations for superbuild ITK projects can be found in Slicer, BrainsTools, ITK Wiki Examples, and ITK Sphinx Examples.

5. Extend ITK with External and Remote Modules

The modularization effort has significantly improved the extensibility of the toolkit and lowered the barrier for contributions. An External module [4] is distributed outside the main ITK repository, but it can be built into ITK as a module once downloaded into the local copy of the ITK source tree.
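The two patterns described in Section 4 (a component-wise find_package and an ExternalProject-based superbuild) might look like the following sketch. The application name (MyApp), the component list, and the repository tag are illustrative assumptions; consult the superbuild projects listed above for production-grade configurations.

```cmake
# --- In the application's CMakeLists.txt: request only the needed modules.
find_package(ITK 4.5 REQUIRED COMPONENTS ITKCommon ITKIOImageBase ITKSmoothing)
include(${ITK_USE_FILE})

add_executable(MyApp MyApp.cxx)
target_link_libraries(MyApp ${ITK_LIBRARIES})

# --- In the superbuild's CMakeLists.txt: download and build ITK first,
# then configure the application against it.
include(ExternalProject)
ExternalProject_Add(ITK
  GIT_REPOSITORY http://itk.org/ITK.git
  GIT_TAG v4.5.0                      # illustrative tag
  CMAKE_ARGS
    -DITK_BUILD_DEFAULT_MODULES=ON
    -DBUILD_EXAMPLES=OFF
    -DBUILD_TESTING=OFF
  INSTALL_COMMAND ""                  # use the build tree directly
)
```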
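An External module is itself described by an itk-module.cmake definition file [2]. A minimal sketch for a hypothetical module named ITKFoo (the module name and its dependencies are illustrative) might be:

```cmake
# itk-module.cmake for a hypothetical External module "ITKFoo".
itk_module(ITKFoo
  DEPENDS
    ITKCommon            # build-time module dependencies
  TEST_DEPENDS
    ITKTestKernel        # dependencies of the module's tests
  EXCLUDE_FROM_DEFAULT   # External/Remote modules are not built by default
  DESCRIPTION "An example External module."
)
```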
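The three command-line usage examples described in Section 3 can be sketched as follows. This is a reconstruction, not the exact snippets: the relative source path (../ITK), the chosen groups (Core, Filtering), and the chosen modules (ITKIOImageBase, ITKSmoothing) are illustrative assumptions.

```shell
# Example 1: all default modules. The -D argument is optional,
# since ITK_BUILD_DEFAULT_MODULES defaults to ON.
cmake ../ITK -DITK_BUILD_DEFAULT_MODULES=ON

# Example 2: enable specific group(s) of modules only.
# BUILD_EXAMPLES must be OFF here, because the examples require
# the full set of default modules.
cmake ../ITK \
  -DITK_BUILD_DEFAULT_MODULES=OFF \
  -DBUILD_EXAMPLES=OFF \
  -DITKGroup_Core=ON \
  -DITKGroup_Filtering=ON

# Example 3: enable specific modules; their dependent modules
# are enabled automatically (recursively).
cmake ../ITK \
  -DITK_BUILD_DEFAULT_MODULES=OFF \
  -DBUILD_EXAMPLES=OFF \
  -DModule_ITKIOImageBase=ON \
  -DModule_ITKSmoothing=ON
```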
When developing a new ITK module, place it into the Modules/External directory of a local ITK source tree to include it in your ITK library. The next step in the module contribution process is to submit the External module to the Insight Journal for review. Once the External module passes dashboard testing and peer review, it can be made into a Remote module [3], which is disseminated from ITK itself while its source code remains distributed independently of the main ITK repository. More specifically, the source code of a Remote module can be downloaded by CMake (with a CMake variable switch) at ITK CMake configuration time, making it a convenient way to distribute modular source code without increasing the size of the main repository. After the Remote module has received sufficient testing and community members have expressed broad interest in the contribution, the submitter can move the contribution into the ITK repository via Gerrit code review.

It is possible, though not recommended, to submit a module directly to Gerrit for review without first submitting it to the Insight Journal. There are no established rules on whether a module should be merged directly into the main repository. However, an Insight Journal article describing the functionality and the design behind the module is a great way to provide extra documentation for the code, and it could serve as good ITK Software Guide material in the future. Figure 4 illustrates the module contribution channels.

Figure 4. Contribute a module to ITK

6. Summary and Discussion

For most ITK consumers/users, the modularization of the toolkit is relatively transparent: the default configuration includes all the default modules in the ITK library, which is then used to build their own ITK applications. For ITK developers and more advanced users, the modular structure has changed the rules for organizing the source code, building the library, and contributing to the ITK community.
The Remote Module infrastructure enables fast dissemination of research code through ITK without increasing the size of the main repository. The Insight Journal has also added support for ITK module submissions with automatic dashboard testing. The VTK community has followed the modularization progress in ITK closely, and VTK is now equipped with the same internal modular architecture, with slightly different configuration options [6].

Adapting to the new modular architecture takes time. Hopefully this blog provides some useful guidance to assist our community members with the transition. Please let us know if you find problems with the instructions, or if you have questions or suggestions. Your comments are always welcome.

References:
[1] Wiki page for ITK modularization: http://www.itk.org/Wiki/ITK/Release_4/Modularization
[2] How to organize your ITK code into an ITK module: http://insightsoftwareconsortium.github.io/ITKBarCamp-doc/ITK/ConstructITKModule/index.html
[3] ITK Remote modules: http://www.itk.org/Wiki/ITK/Policy_and_Procedures_for_Adding_Remote_Modules
[4] ITK External modules: http://www.itk.org/Wiki/ITK/Release_4/Modularization/Add_an_external_module_(external_module)
[5] Doxygen page for ITK modules: http://www.itk.org/Doxygen/html/modules.html
[6] VTK Modularization and Modernization: http://www.kitware.com/blog/home/post/496
[7] ITK modularization: Divide and Conquer: http://www.kitware.com/source/home/post/34