Sunday, October 18, 2015

Tosca Tips & Tricks: notes on Manual Testing module of TCP

Notes on the PowerPoint presentation by Tricentis as part of the 'Manual Testing' module of TCP:
  • Hierarchy of elements in the TestCases section:
    • TestCase Folder
    • TestCase
    • TestStepFolder
    • Manual TestStep
    • Manual TestStepValue
  • The name of a TestCase should be unique
  • A hand on a symbol -> manual element
  • The ScratchBook is a temporary aid for executing TestSteps
    • Scratchbook results are not stored
  • ActionMode
    • Determines how to process value-entry for each TestStepValue
    • Default ActionMode: DoNothing
    • Hide DoNothing TestStepValues with F9
    • 'Reset value' -> clears the value field and sets the ActionMode to DoNothing
    • ActionMode: Input
    • ActionMode: Verify
      • Verification type: ==, !=, < (if numeric), > (if numeric)
      • Wildcards (*, ?) are possible (see the verification sketch after this list)
  • Hierarchy of elements in the ExecutionLists section:
    • ExecutionListFolder
    • ExecutionList
    • ActualLog
    • ExecutionEntryFolder
    • ExecutionEntry
  • ExecutionList: test result colors:
    • Green: passed
    • Red: failed
    • White: not executed
    • Grey: no longer available in workspace
  • ExecutionLists to which TestCaseFolders/TestCaseInstances have been added can be updated with 'Synchronize'
  • Test results can be set manually using the command 'Set result'
  •  Manual Engine options:
    • Add comment
    • Pausing / Resuming
    • Multipassing / Set passed until here
    • Run from here (=rollback)
    • Exit
    • Take screenshots
  • ExecutionList vs. RequirementSet
    • If the TestCases behind the TestCaseLinks in a RequirementSet are included in multiple ExecutionLists that have been added to that RequirementSet, the property 'ResultAggregation' controls which results are displayed:
      • First: only the results of the topmost ExecutionList
      • Each: the results of every linked ExecutionList are taken into account (see the sketch below)
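
A minimal sketch (plain Python, outside Tosca) of the difference between the two ResultAggregation settings; the function and the example data are my own illustration of the behaviour described above, not Tosca functionality:

    def aggregate(results, mode="First"):
        # results: the results one TestCaseLink gets from the ExecutionLists,
        # ordered top to bottom as the lists appear under the RequirementSet
        if mode == "First":
            return [results[0]]   # only the topmost ExecutionList counts
        if mode == "Each":
            return results        # every linked ExecutionList is reported
        raise ValueError(mode)

    results = ["Passed", "Failed"]        # e.g. smoke run on top, regression run below
    print(aggregate(results, "First"))    # ['Passed']
    print(aggregate(results, "Each"))     # ['Passed', 'Failed']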
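
The Verify ActionMode semantics noted earlier (==, !=, numeric < and >, wildcards * and ?) can be mimicked outside Tosca as follows; this is only a sketch of the comparison rules, and the function name and behaviour are my own assumptions, not Tosca's implementation:

    from fnmatch import fnmatch

    def verify(actual, expected, operator="=="):
        # == / != compare as text and honour the * and ? wildcards;
        # < and > only make sense for numeric values
        if operator == "==":
            return fnmatch(str(actual), str(expected))
        if operator == "!=":
            return not fnmatch(str(actual), str(expected))
        if operator in ("<", ">"):
            a, e = float(actual), float(expected)
            return a < e if operator == "<" else a > e
        raise ValueError("Unsupported operator: " + operator)

    print(verify("Order 4711 created", "Order * created"))  # True (wildcard match)
    print(verify(42, 100, "<"))                             # True (numeric compare)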


Notes on the videos by Tricentis as part of the 'Manual Testing' module of TCP:
  •  Element order:
    • Testcase
    • Teststep Folder
    • Manual Teststep
    • Manual Teststep Value
  • Create Manual Teststep: Ctrl+N; Ctrl+M
  • Create Manual TeststepValue: Ctrl+N; Ctrl+M
  • Follow instructions at the top of the manual testing popup
    • Click the green tick for success, and the red cross for failure
    • In the bottom row of the popup, colored blocks represent the results of the individual TestSteps (green/red)
  • ActionMode 'Verify': colored green; speaks for itself.
  • Testcase vs. Requirements
    • Drag and drop a testcase on a requirement and a 'Testcase Link' is created (yellow circular arrow with blue ribbon)
  • Requirements columns 'Coverage Specified (%)' and the Testcase Link symbol change with the TestCase Workstate (see the coverage sketch after this list):
    • TestCase Workstate: PLANNED
      • Coverage Specified: 20%
      • The Testcase Link symbol has a wrench pointing to three o'clock
    • TestCase Workstate: IN_WORK
      • Coverage Specified: 50%
      • The Testcase Link symbol has a wrench pointing to twelve o'clock
    • TestCase Workstate: COMPLETED
      • Coverage Specified: 100%
      • The Testcase Link symbol no longer has a wrench in it
  • ExecutionList (green triangle with a shadow) 
    • Has an ActualLog where test results are stored
      • The column 'Loginfo' displays test results (white/green/red)
      • Loginfo also stores comments/errors of failed TestSteps/TestStepValues
  • Testcase vs. ExecutionList
    • Drag and drop a testcase on an ExecutionList and an 'ExecutionEntry' is created
  • ExecutionList vs. RequirementSet
    • Drag and drop an ExecutionList to a RequirementSet and
      • The ExecutionList will appear underneath the RequirementSet
      • The RequirementSet will compare the results of the Testcases in the ExecutionList with the TestCaseLinks it has in its own requirement leaves
        • And adjust the data accordingly in columns such as:
          • Coverage Execution (%)
          • Execution State (%)
        • TestCaseLinks that are given an execution result by the ExecutionList get a green tick or a red cross in the lower right corner of their symbol
        • If the requirement leaves have been weighted, the weights are taken into account when adjusting the data in these columns (see the weighted-coverage sketch after this list)
    • You can drag and drop more than one ExecutionList to a RequirementSet
      • This is not surprising: the 'One Testsheet per Requirement' rule means there are several 'TestcaseTemplates' worth of testcases that need to be linked to one RequirementSet, and it would feel forced to first put all those testcases into one and the same ExecutionList just to link them to the proper RequirementSet.
  • Printing a report
    • Make sure the view is set to the window you want to print
    • Click 'Print preview'
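
A small sketch of the 'Coverage Specified (%)' figures from the notes above; the per-workstate percentages are the ones shown in the video, while the averaging over several Testcase Links of one requirement is my own assumption:

    # Contribution of a TestCase Workstate to 'Coverage Specified', as seen in the video
    COVERAGE_SPECIFIED = {"PLANNED": 20, "IN_WORK": 50, "COMPLETED": 100}

    def coverage_specified(workstates):
        # assumed: the column shows the average over all linked TestCases
        if not workstates:
            return 0.0
        return sum(COVERAGE_SPECIFIED[s] for s in workstates) / len(workstates)

    print(coverage_specified(["PLANNED"]))                # 20.0
    print(coverage_specified(["COMPLETED", "IN_WORK"]))   # 75.0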
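
And a sketch of how requirement weights could feed into the aggregated execution columns of the RequirementSet; the weighted-average formula is an assumption based on the note above, not taken from Tosca documentation:

    def weighted_execution_coverage(leaves):
        # leaves: list of (weight, fraction of that leaf's linked testcases executed)
        total_weight = sum(weight for weight, _ in leaves)
        if total_weight == 0:
            return 0.0
        covered = sum(weight * executed for weight, executed in leaves)
        return 100.0 * covered / total_weight

    # two requirement leaves: the heavier one fully executed, the lighter one not at all
    print(weighted_execution_coverage([(3, 1.0), (1, 0.0)]))  # 75.0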

Tuesday, October 13, 2015

Tosca Tips & Tricks: notes on TestCase Design module of TCP

Notes on the PowerPoint presentation by Tricentis as part of the 'TestCase Design' module of TCP:
  • Testcase Design section elements (p.11)
    • TestCase Design Folder => TestSheet => Attribute => Instance
  • Attribute properties (p.12+)
    • AttrType:
      • Logical (grey top)
      • Physical (red top / no coloured top)
    • BusinessRelevant:
      • No (white top)
      • Yes (red top / no coloured top)
      • Result (green top)
  • Instance properties (p.15):
    • Character (F7)
      • Valid
      • Invalid
      • Straight Through
    • Position (F8)
      • Inner
      • Boundary
  • Equivalence class (p.16)
    • Each representative of a class produces the same error => 1 representative value per class is enough
    • Equivalence classes work because the same result is expected for each and every representative of the same class
  • Equivalence Partitioning (p.17+)
    • Boundary value errors are one-time errors and don't increase coverage (therefore: not part of the regression set); see the equivalence-class sketch after this list
  • Floating attributes (numerical values)
  • Non-floating attributes (enumeration of values)
    • Avoid testing all invalid values; test only the needed/requested ones
  • Straight Through (p.27)
    • It is the test case with the highest risk
    • It is the test case which can be most easily implemented
    • It is the test case which offers the best flexibility to be combined with the other attributes
  • Standard TestSheet: subdivide attributes into:
    • Preconditions
    • The processes under test
    • Verifications
  • Linear Expansion
    • One defined Straight Through
    • One test focus per test case
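
To make the equivalence-class and boundary-value notes above concrete, here is a small sketch for a hypothetical numeric attribute that is valid between a lower and an upper limit; the attribute, its limits and the representative values are made up for illustration:

    def design_instances(lower, upper):
        # one representative per equivalence class, plus the boundary values
        return {
            "valid (inner)":   (lower + upper) // 2,  # one representative is enough
            "invalid (below)": lower - 10,            # representative of the class below the range
            "invalid (above)": upper + 10,            # representative of the class above the range
            "boundaries":      [lower - 1, lower, upper, upper + 1],  # one-time checks, not regression
        }

    # e.g. an 'age' attribute that is valid from 18 up to and including 65
    print(design_instances(18, 65))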

Notes on videos by Tricentis as part of the 'TestCase Design' module of TCP:
  • Arranging Instances (right-click an attribute => Arrange instances)
    • Order: Straight through, Valid instances, Invalid instances, Boundary values
  • Generating Instances
    • Right-click an attribute => Create Instance (works both on the left and the right side of the Tosca viewing screen - on the right side even by just typing a new instance value into an attribute field)
  • Combining attributes at a higher level
    • Create Instance and delete the single empty instance at the desired combination level, leaving only an Instance Folder
    • Select the attributes to combine
    • Right-click the selected instances -> Complete Instances -> Linear Expansion
  • Instance Filter
    • Use the Filter column in the instance selection screen to filter for particular attribute values
    • Particularly handy when checking for redundant testcases
    • 'Reset Instance Filter' when done comparing
  • Redundant testcases
    • Just delete them
  • Best Practices for Instances
    • Naming: separate different aspects of an attribute value and/or instance with pipes
  • Verification Attributes (green top)
    •  (Often, always?) no need to create an instance at the top attribute (verifications are RESULTS of the instances, not instances themselves)
  • Creating Instances on a Testsheet level
    • Make sure the instances of the TestSheet's attributes (if any) are generated
    • Right-click -> Complete Instances -> Linear Expansion
      •  Verifications will be ignored when it comes to the combinatorics
  • Combinatorics
    • Linear Expansion
      • Contribution of an attribute with N instances to the total number of testcases of the instance above it: N-1 (with the '-1' being the straight through); see the linear-expansion sketch after this list
  • Verifications
    • Choose the value 'N/A' for verification attributes which are not relevant (such as error message for a valid instance or success messages for invalid ones).
  • Best Practices
    • Create one Testsheet per requirement
  • TestCaseDesign vs. Requirements
    • You can only drag and drop a Testsheet onto a leaf requirement, not onto a branch requirement
    • If a Testsheet is dragged and dropped onto a requirement, a 'Testcase Substitute Link' is created for every Instance of the Testsheet (yellow circular arrow with red ribbon)
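
Finally, the linear-expansion arithmetic noted above (one straight through, plus N-1 extra testcases for an attribute with N instances) written as a tiny helper; the instance counts in the example are invented:

    def linear_expansion_count(instances_per_attribute):
        # 1 straight-through testcase + (N - 1) variations for each attribute
        return 1 + sum(n - 1 for n in instances_per_attribute)

    # e.g. three attributes with 3, 2 and 4 instances each
    print(linear_expansion_count([3, 2, 4]))  # 1 + 2 + 1 + 3 = 7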