Jubula User Manual


Table of Contents
1. Introduction
1.1. Introduction to this chapter
1.2. Comparison to other testing approaches
1.2.1. Introduction to the different testing approaches
1.2.2. Manual Tests
1.2.3. Programmed Tests
1.2.4. Recorded Tests
1.2.5. Our approach
1.2.5.1. Early test creation
1.2.5.2. Code-free automation
1.2.5.3. Manual tester intelligence
1.3. How to read this manual
1.3.1. About the help system
1.3.2. Layout
1.3.3. Conventions Used
1.3.3.1. Typesetting Conventions
2. Samples: example tests
2.1. Introduction to this chapter
2.2. Accessing the prepared Project
2.2.1. Importing and loading the prepared Project
2.2.2. Result Reports
2.3. The structure of the example Project
2.3.1. Introduction to the example Project
2.3.2. The reused Projects
2.3.3. The categories
2.4. Adder Tests
2.4.1. Sample 1: using the Swing Simple Adder
2.4.1.1. Sample 1.1: creating a Test Case from Test Steps
2.4.1.2. Sample 1.2: creating a Test Case using the library
2.4.1.3. Sample 1.3: using Event Handlers
2.4.2. Sample 2: using the SWT Simple Adder
2.4.2.1. Sample 2: Simple Adder SWT Test
2.4.3. Sample 3: using the HTML Simple Adder
2.4.3.1. Sample 3.1: HTML test with the library
2.4.3.2. Sample 3.2: HTML test with multiple data sets
2.4.4. Sample 4: using the JavaFX Simple Adder
2.4.4.1. Sample 4: Simple Adder JavaFX test using the library
2.5. DVD Tool Tests
2.5.1. Sample 2.1: testing the menu bar and dialog boxes
2.5.2. Sample 2.2: testing trees
2.5.3. Sample 2.3: testing tables
2.5.4. Sample 2.4: testing tabbed panes, lists, combo boxes
3. Tasks
3.1. Introduction to this chapter
3.2. Starting and connecting to the AUT Agent
3.2.1. Introduction to the AUT Agent
3.2.2. Starting the AUT Agent
3.2.2.1. Windows users
3.2.2.2. Linux users
3.2.2.3. Starting the AUT Agent from the command line: options and parameters
3.2.3. Connecting to the AUT Agent
3.3. Starting the Integrated Test Environment (ITE)
3.3.1. Windows users
3.3.2. Unix users
3.3.3. Choosing a workspace
3.3.4. Restarting the ITE
3.3.5. Help system
3.3.6. Working with the AUT Agent and client on one machine
3.4. Logging into and switching databases
3.4.1. Logging in to the database
3.4.2. Selecting and changing the database connection
3.5. Migrating to newer versions
3.6. Working with Projects
3.6.1. Introduction to Projects
3.6.2. Creating a new Project
3.6.3. Editing the Project and AUT properties
3.6.3.1. Editing general Project properties
3.6.3.2. Changing the toolkit settings for a Project
3.6.3.3. Editing the AUTs in a Project
3.6.3.4. Duplicating AUT configurations
3.6.3.5. Editing the AUT configurations in a Project
3.6.4. Reusing (referencing) whole Projects in a Project
3.6.4.1. Changing the version of a reused Project
3.6.4.2. Searching for deprecated modules
3.6.5. Opening Projects
3.6.5.1. Auto loading a default Project
3.6.6. Refreshing Projects
3.6.7. Deleting Projects
3.6.8. Saving a Project as a new Project
3.6.9. Importing Projects
3.6.9.1. Importing a Project through the menu
3.6.9.2. Importing Projects through drag and drop
3.6.10. Exporting Projects
3.6.10.1. Exporting the currently opened Project
3.6.10.2. Exporting all of the Projects from the database
3.6.11. Versioning Projects
3.6.12. Tracking changes in a Project
3.6.12.1. Activating change tracking
3.6.12.2. Removing change tracking information from a Project
3.7. Defining applications under test (AUTs)
3.8. Starting and configuring AUTs
3.8.1. Configuring AUTs to be started from the ITE
3.8.1.1. AUT activation
3.8.2. Basic information required for every AUT configuration
3.8.3. Using a working directory in an AUT configuration
3.8.4. Starting Java AUTs (Swing, SWT/RCP/GEF)
3.8.4.1. Two options to start Java AUTs
3.8.4.2. Configuring a Java AUT to be started from the ITE
3.8.4.3. Basic Java AUT configuration
3.8.4.4. Advanced AUT configuration
3.8.4.5. Expert AUT configuration
3.8.4.6. Starting Java AUTs with the autrun command
3.8.4.7. Creating an AUT definition from a running AUT
3.8.5. Starting JavaFX AUTs
3.8.5.1. Configuring a Java AUT to be started from the ITE
3.8.5.2. Basic JavaFX AUT configuration
3.8.5.3. Advanced JavaFX AUT configuration
3.8.5.4. Expert JavaFX AUT configuration
3.8.6. Starting Web AUTs (HTML)
3.8.6.1. Basic HTML AUT configuration
3.8.6.2. Advanced HTML AUT configuration
3.8.6.3. Expert HTML AUT configuration
3.8.7. Starting other AUTs
3.9. Working with browsers: renaming, deleting, using IDs, multiple browsers
3.9.1. Introduction to browsers
3.9.2. Renaming items in browsers
3.9.3. Deleting items from browsers
3.9.4. Deleting Test Cases with orphans
3.9.5. Working with IDs for Test Cases and Test Suites
3.9.5.1. Copying the ID of a Test Case or Test Suite to the clipboard
3.9.5.2. Opening an element based on an ID in the clipboard
3.9.6. Opening the Test Case Browser multiple times
3.9.7. Opening the task editor for items in browsers
3.10. Working with editors: opening, adding/deleting/renaming items, commenting, adding descriptions, extracting and replacing, reverting changes
3.10.1. Introduction to editors
3.10.2. Opening items in editors
3.10.3. Navigating between Editors
3.10.4. Adding items to editors
3.10.5. Deleting items from editors
3.10.6. Renaming items in editors
3.10.7. Adding comments to items in editors
3.10.8. Adding descriptions to items in editors
3.10.9. Adding Task IDs to items in editors
3.10.10. Commenting out items in editors
3.10.11. Copying and pasting items in editors
3.10.12. Extracting Test Cases from editors: Refactoring
3.10.13. Replacing Test Cases in editors: Refactoring
3.10.14. Saving Test Cases from an editor as a new Test Case
3.10.15. Object Mapping Category Association
3.10.16. Reverting changes in an editor
3.11. Working with categories in the browsers and editors
3.11.1. Introduction to categories
3.11.2. Creating a category
3.11.3. Creating Test Cases, Test Suites and Test Jobs in an existing category
3.11.4. Adding comments to categories
3.12. Working with Test Cases
3.12.1. Working with Test Cases
3.12.2. Creating Test Cases
3.12.3. Creating tests from the library of pre-defined Test Cases
3.12.3.1. Using the library to create tests
3.12.3.2. Information about the library
3.12.3.3. Tips and tricks for using the Test Case library
3.12.4. Opening existing Test Cases
3.12.5. Editing Test Cases
3.12.6. Adding and inserting new Test Cases to a Test Case
3.12.7. Moving Test Cases to external Projects
3.12.8. Replacing a specific Test Case at places where it has been reused
3.13. Working with Conditional Statements and Loops
3.13.1. Overview of Conditional Statements and Loops
3.13.2. Conditional Statements (If - Then - Else)
3.13.3. Do - While and While - Do loops
3.13.4. Repeat loops
3.14. Working with test data
3.14.1. Data types and entering data for Test Cases
3.14.2. Entering concrete values as data in Test Cases
3.14.3. Using references for data in Test Cases
3.14.4. Using the edit parameters dialog to add, edit and delete references
3.14.4.1. Adding and editing ValueSets for Parameters
3.14.5. Using variables as data for Test Cases
3.14.5.1. Reading and using values (variables) from the AUT
3.14.5.2. Using environment variables in tests
3.14.5.3. Using the pre-defined test execution variables
3.14.6. Using functions as data for Test Cases
3.14.6.1. Syntax for functions
3.14.6.2. Pre-defined functions
3.14.6.3. Embedding functions in other functions
3.14.6.4. Useful examples for functions
3.14.6.5. Adding your own functions
3.14.7. Concatenating (combining) parameters
3.14.8. Viewing and changing data sources for Test Cases
3.14.8.1. Changing the data source for a Test Case
3.14.9. Using central data sets
3.14.9.1. Creating and editing central test data sets
3.14.9.2. Deleting central test data sets
3.14.9.3. Adding and modifying parameters for central test data sets
3.14.9.4. Entering data for central test data sets
3.14.9.5. Reusing central test data sets in Test Cases
3.14.9.6. Importing Excel files as central test data
3.14.9.7. Changing the column used in a central test data set for multiple Test Cases
3.14.10. Using an Excel file as an external data source
3.14.10.1. Configuring the Excel file
3.14.10.2. Using the =TODAY() function in Excel
3.14.11. Using the Data Sets View to enter data loops
3.14.11.1. Data Sets View: adding multiple data sets to a Test Case
3.14.12. Special parameters: empty strings, the escape character, and skipping test steps
3.14.13. Overwriting data for Test Cases and Test Suites
3.15. Working with component names
3.15.1. Introduction to component names
3.15.2. Creating new component names
3.15.3. Entering and reassigning component names in the Component Names View
3.15.4. Renaming component names
3.15.5. Propagating component names
3.15.6. The 'No component type exists' message in the Component Names View
3.15.7. Merging component names
3.15.8. Deleting unused component names
3.15.9. Understanding the component hierarchy
3.16. Working with Test Suites
3.16.1. Creating a Test Suite
3.16.2. Configuring Test Suites in the Properties View
3.17. Working with Test Jobs to test multiple AUTs
3.17.1. Combining Test Suites into a Test Job
3.17.2. Testing different AUTs in one test run
3.17.2.1. Testing independently started AUTs
3.17.2.2. Testing AUTs that are launched by other AUTs
3.17.3. Creating a new Test Job
3.17.4. Specifying which AUT to test in a Test Job
3.18. Information on Test Steps
3.18.1. Information on Test Steps
3.18.2. Specifying Test Steps
3.18.3. Editing Test Steps
3.19. Working with manual Test Cases
3.19.1. Creating manual tests
3.19.2. Executing and analyzing manual tests
3.20. Object mapping
3.20.1. Object mapping
3.20.2. Working with the Object Mapping Editor
3.20.2.1. Opening the Object Mapping Editor
3.20.2.2. The Object Mapping Editor
3.20.2.3. Working with categories in the Object Mapping Editor
3.20.2.4. The configuration view in the Object Mapping Editor
3.20.2.5. Refreshing the Object Mapping Editor
3.20.2.6. Finding components in the AUT via the Object Mapping Editor: highlight in AUT
3.20.3. Deleting from the Object Mapping Editor
3.20.3.1. Removing unused component names from the Object Mapping Editor
3.20.4. Collecting components (technical names) from the AUT
3.20.4.1. For Java AUTs:
3.20.4.2. For HTML AUTs:
3.20.4.3. Understanding the colored dots when collecting component names in the Object Mapping Editor
3.20.5. Mapping (assigning) collected technical names to component names
3.20.6. Object mapping and AUT changes
3.20.7. Component-specific Profile
3.20.8. Viewing properties for a component in the Object Mapping Mode
3.21. Test execution
3.21.1. Prerequisites for test execution
3.21.2. Starting the AUT
3.21.3. Starting, stopping and pausing Test Suites and Test Jobs
3.21.3.1. Starting a Test Suite or a Test Job
3.21.3.2. Stopping a Test Suite or Test Job
3.21.3.3. Pausing a Test Suite or Test Job
3.21.4. Interactive test analysis
3.21.4.1. Pause on Error
3.21.4.2. Continuing after an error
3.21.5. Altering the speed of test execution
3.22. Working with test results
3.22.1. The Test Result View
3.22.2. XML/JUnit/HTML and monitoring reports
3.22.3. Working with the Test Result Summary View
3.22.3.1. Re-opening the test result view for a test run
3.22.3.2. Filtering and sorting in the Test Result Summary View
3.22.3.3. Changing the number of result summaries shown in the Test Result Summary View
3.22.3.4. Changing the relevance of a test run
3.22.3.5. Refreshing the Test Result Summary View
3.22.3.6. Deleting test runs from the Test Result Summary View
3.22.3.7. Resetting the column widths in the Test Result Summary View
3.22.3.8. Exporting test results from the Test Result Summary View as HTML and XML reports
3.22.3.9. Entering comments for test runs in the Test Result Summary View
3.23. Producing long-term reports of test runs with BIRT
3.23.1. Generating BIRT reports
3.23.2. Writing your own BIRT reports
3.23.2.1. Creating a BIRT report
3.23.3. Showing BIRT reports in an external viewer
3.24. Working with external task repositories (ALM Integration)
3.24.1. Introduction to external task repositories
3.24.2. Configuring task repositories in your workspace
3.24.2.1. Adding an HP ALM repository to your workspace
3.24.3. Working on tasks in the ITE: contexts
3.24.3.1. Opening and editing tasks in the ITE
3.24.3.2. Working on tasks in the ITE
3.24.4. Creating tasks in external repositories from test result reports
3.24.5. Automatically reporting to external repositories after test runs
3.24.5.1. Configuring a task repository for your Project
3.24.5.2. Configuring reporting to tasks
3.24.5.3. Adding task IDs to Test Suites and Test Cases
3.24.5.4. Test execution with reporting to external repositories
3.24.5.5. Specific information for HP ALM users
3.25. Using the test executor for testing from the command line
3.25.1. Introduction to using the test executor
3.25.2. Starting the test executor
3.25.3. Parameters for the test executor
3.25.3.1. Using a separate AUT Agent or the embedded AUT Agent
3.25.3.2. Test Suites and Test Jobs
3.25.3.3. Using the dburl instead of workspace and dbscheme
3.25.3.4. Starting the test execution via testexec
3.25.3.5. Passing on arguments to the JVM
3.25.4. Using the test executor with the embedded database
3.25.5. Using a configuration file
3.25.6. Working with the no-run option
3.26. Using the dbtool client to import, delete and export from the command line
3.26.1. Introduction to the dbtool client
3.26.2. Starting the dbtool
3.26.3. Parameters for the dbtool
3.26.3.1. Deleting Projects but keeping test result summaries
3.26.3.2. Deleting test result summaries
3.26.3.3. Deleting test result details
3.26.3.4. Creating new versions of Projects
3.26.3.5. Entering version numbers in the dbtool
3.27. Dealing with errors in tests: Event Handlers
3.27.1. Introduction to Event Handlers
3.27.2. Adding Event Handlers to a Test Case
3.27.3. Event types
3.27.4. Reentry types
3.28. Working with code coverage with Java tests
3.28.1. Working with code coverage with Java tests
3.28.2. Configuring code coverage for an AUT
3.28.2.1. Increasing the Java Heap Space for code coverage
3.28.3. Resetting and accumulating code coverage
3.28.4. Viewing the code coverage for a test run
3.28.5. Troubleshooting code coverage
3.29. Preferences
3.29.1. Introduction to preferences
3.29.2. Test preferences
3.29.3. AUT Agent preferences
3.29.4. Embedded AUT Agent preferences
3.29.5. Database preferences
3.29.5.1. Adding, editing and removing database configurations
3.29.6. Editor preferences
3.29.7. Browser preferences
3.29.8. Object mapping preferences
3.29.9. Observation mode preferences
3.29.10. Test result preferences
3.29.11. Importing and exporting database preferences
3.29.12. Label decoration preferences
3.29.13. Workspace preferences
3.29.14. General/Keys preferences
3.29.15. Help preferences
3.30. Searching
3.30.1. Searching for and opening the original specification of a Test Case or Test Suite
3.30.2. Searching for places where a Test Case or Test Suite has been used
3.30.3. Searching for places where a component name has been used
3.30.4. Searching for places where a central test data set has been used
3.30.5. Using the search dialog
3.30.5.1. Searching for keywords throughout the Project
3.30.5.2. Searching for test data
3.30.5.3. Searching for files in the workspace
3.30.5.4. Limiting the search to the selected node
3.30.6. Searching for tasks in ALM repositories
3.30.7. Using the search result view
3.30.7.1. Using search results to make wide-reaching changes to your Project
3.30.8. Searching for items in editors and browsers
3.30.9. Using filters in the ITE
3.30.9.1. Text filters
3.30.10. Other filter options
3.31. Observing Test Cases
3.31.1. Introduction to the observation mode
3.31.2. Tips and tricks for using the observation mode
3.31.3. Starting observing
3.31.4. Observing tests in Java AUTs
3.31.4.1. Actions that cannot be recorded
3.31.4.2. Performing checks in the Java observation mode
3.32. Working with the Problems View
3.32.1. The Problems View
3.33. Working with the Teststyle guidelines
3.33.1. Introduction to Teststyle
3.33.2. Activating Teststyle for a Project
3.33.3. Configuring Teststyle for a Project
3.33.3.1. Activating and deactivating individual guidelines
3.33.3.2. Setting the message level for guidelines
3.33.3.3. Configuring the attributes for guidelines
3.33.3.4. Configuring the contexts for guidelines
3.33.4. Working with the Problems View to view and fix Teststyle problems
3.33.4.1. Viewing the broken Teststyle rule from the Problems View
3.33.4.2. Using Quick Fix to fix the problem
3.33.5. Additional information for Teststyles
3.33.5.1. TODO Teststyle
3.34. Adapting the user interface
3.34.1. Introduction to the user interface
3.34.2. Moving Browsers, Views and Editors
3.34.3. Resizing in the user interface
3.34.4. Restoring user interface defaults
3.34.5. Changing perspectives
3.34.5.1. Automatically changing perspective
3.34.6. Keyboard shortcuts
3.35. Launch Configurations
3.35.1. Introduction to launch configurations
3.35.2. Requirements
3.35.3. Customizing the Perspective
3.35.4. Starting the AUT
3.35.5. AUT Agent
3.35.6. Additional information for RCP AUTs
3.35.6.1. Keyboard Layout
3.35.6.2. RCP Remote Control Plug-in
3.35.7. Common Pitfalls
3.36. Troubleshooting
3.36.1. General help
3.36.2. I can’t start the AUT Agent
3.36.3. I can’t connect to the AUT Agent
3.36.4. I can’t start the AUT
3.36.5. I can’t map components in the Object Mapping Mode
3.36.6. I can’t execute my Test Suite
3.36.7. My Test Suite failed
3.36.8. My Test Suite failed when using rdesktop
3.36.9. I can’t save my editor
3.36.10. Creating a support information package
3.36.11. Log file locations
3.37. Special characters
3.37.1. Verbatim text symbol
3.37.2. General special characters
3.37.3. Symbols with special meanings for certain parameters
3.38. Using simple matching and regular expressions for text verification
3.38.1. Introduction to regular expressions and matching
3.38.2. Using the 'matches' operator
3.38.3. Using the 'simple match' operator
3.39. Using relative paths in AUT configurations and as test data
3.40. Remote Debugging
3.40.1. Configuring Eclipse for remote debugging
3.41. Finishing up
3.41.1. Stopping the AUT
3.41.2. Disconnecting from the AUT Agent
3.41.3. Closing the ITE and stopping the AUT Agent
3.41.4. Stopping the AUT Agent
4. Toolkit-specific information
4.1. Introduction to this chapter
4.2. Testing Swing AUTs
4.2.1. Introduction to writing tests for Swing AUTs
4.2.2. Supported Swing AUTs
4.2.3. Design for testability in Swing
4.2.3.1. Naming components
4.2.3.2. Adding support for text retrieval
4.3. Testing RCP AUTs
4.3.1. Introduction to writing tests for RCP AUTs
4.3.2. Supported RCP AUTs
4.3.3. Setting up an RCP AUT for testing
4.3.3.1. Setting up an RCP AUT for testing as a part of the build process
4.3.4. Keyboard Layouts
4.3.5. Design for testability in RCP
4.3.5.1. Naming components
4.3.5.2. Adding support for text retrieval
4.3.6. Component name generation in RCP
4.3.7. Best practices for testing RCP AUTs
4.4. Testing GEF AUTs
4.4.1. Introduction to writing tests for GEF AUTs
4.4.2. Testing GEF components
4.4.3. Using the GEF inspector
4.5. Testing JavaFX AUTs
4.5.1. Introduction to writing tests for JavaFX AUTs
4.5.2. Design for testability in JavaFX
4.5.2.1. Naming components
4.5.3. Information on the support for JavaFX AUTs
4.6. Testing HTML AUTs
4.6.1. Supported HTML AUTs
4.6.2. Design for testability in HTML AUTs
5. User interface
5.1. Overview user interface
5.2. Perspectives
5.2.1. The Specification Perspective
5.2.2. The Execution Perspective
5.2.3. The Functional Test Reporting Perspective
5.2.4. The Workspace Perspective
5.3. Browsers
5.3.1. Introduction to browsers
5.3.2. The Test Suite Browser
5.3.3. The Test Case Browser
5.3.4. The Component Names Browser
5.4. Editors
5.4.1. Introduction to editors
5.4.2. Test Case Editor
5.4.3. Test Suite Editor
5.4.4. Object Mapping Editor
5.4.5. Central Test Data Editor
5.5. Views
5.5.1. Introduction to views
5.5.2. The Properties View
5.5.3. The Data Sets View
5.5.4. The Component Names View
5.5.5. The Test Result View
5.5.6. The Problems View
5.5.7. The search result view
5.5.8. The Description View
5.5.9. The Navigator View
5.5.10. The console
5.5.11. The Inspector View
5.5.12. The Test Result Summary View
5.5.13. The Running AUTs View
5.5.14. The Image View
5.5.15. The Log View
5.5.16. The Progress View
5.6. The status bar
6. Concepts
6.1. Introduction to this chapter
6.2. Overview
6.3. Testing
6.3.1. Introduction to testing applications
6.3.2. Understanding how the ITE and test execution work
6.3.2.1. Actions
6.3.2.2. Test execution
6.3.3. Standards conformance
6.4. Architecture
6.4.1. Introduction to architecture
6.4.2. ITE
6.4.3. AUT Agent
6.4.4. Working with the ITE and AUT Agent on different machines
6.5. Database structure
6.5.1. Introduction to database structure
6.5.2. Supported systems
6.5.3. Single-user
6.5.4. Multi-user
6.6. Approaches to testing
6.6.1. Writing modules in advance
6.6.2. Creating modules from existing Test Cases
6.6.3. Choosing a method
6.7. Test hierarchy
6.7.1. Introduction to the test hierarchy
6.7.2. Test Steps
6.7.3. Test Cases
6.7.4. Test Suites
6.7.5. Test Jobs
6.7.6. Projects
6.8. Reusability
6.8.1. Introduction to the reusability of Test Cases
6.8.2. Abstract, concrete and toolkit specific components
6.9. Object mapping
6.9.1. Introduction to object mapping
6.9.2. Component names
6.9.3. Technical names
6.9.4. Assigning technical names to component names
6.9.5. Locating components during test execution
6.10. Test execution
6.10.1. Introduction to test execution
6.10.2. Test Step execution
6.11. Observing user actions
6.12. Event Handlers
6.12.1. Event Handlers
6.12.2. How Event Handlers work
6.12.3. Default Event Handlers
6.12.4. Customized Event Handlers
6.13. Extensibility
6.14. Summary
7. Glossary


Copyright BREDEX GmbH 2015. Made available under the Eclipse Public License v1.0.