$ behat --definitions="field"   (or simply behat -dfield)

web_features | When /^(?:|I )fill in "(?P<field>(?:[^"]|\\")*)" with "(?P<value>(?:[^"]|\\")*)"$/
             | Fills in form field with specified id|name|label|value.
             | at `Behat\MinkExtension\Context\MinkContext::fillField()`

web_features | When /^(?:|I )fill in "(?P<field>(?:[^"]|\\")*)" with:$/
             | Fills in form field with specified id|name|label|value.
             | at `Behat\MinkExtension\Context\MinkContext::fillField()`
Format & Output Example
If you don’t want to print output to the console, you can tell Behat to write output to a file instead of STDOUT with the --out option:
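For example, the following writes the pretty formatter's output to a file (report.txt is just a placeholder file name):

behat --format pretty --out report.txt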
Some formatters, like junit, always require the --out option to be specified.
The junit formatter generates *.xml files for every suite, so it needs a destination directory to put these XML files into.
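The combined run described below might look like this (the file and directory names are taken from the description that follows):

behat --format pretty --out ~/pretty.out --format junit --out xml --format progress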
In this case, the output of the pretty formatter will be written to the ~/pretty.out file, the output of the junit formatter will be written to the xml directory, and the progress formatter will simply print to the console.
--format progress
T:\git\behatmink (master -> origin)
λ bin\behat -c bh-p-linear.yml --format progress --tags=smoke
..............F.....F
--- Failed steps:
001 Scenario: Test Should Fail # features\ReflectionTest.feature:11
Then Test Case Fails # features\ReflectionTest.feature:14
Failed asserting that false is true.
002 Scenario: Test Should Fail # features\smoke\ReflectionSmokeTest.feature:12
Then Test Case Fails # features\smoke\ReflectionSmokeTest.feature:15
Failed asserting that false is true.
7 scenarios (5 passed, 2 failed)
21 steps (19 passed, 2 failed)
0m1.01s (10.07Mb)
--format pretty
λ bin\behat -c bh-p-linear.yml --format pretty --tags=smoke
@smoke
Feature: Title of your feature
I want to use this template for my feature file
Scenario Outline: Data Driven Testing # features\datadriven.feature:12
Given I have data for <name> field # FeatureContext::iHaveDataForNameField()
When I use data <value> for entry # FeatureContext::iUseDataForEntry()
Then I receive data <status> back from for # FeatureContext::iReceiveDataSuccessBackFromFor()
Examples:
| name | value | status |
| name1 | 5 | success |
│ Name: name1
│ Value: 5
│ Status: success
| name2 | 7 | Fail |
│ Name: name2
│ Value: 7
│ Status: Fail
| name3 | 8 | success |
│ Name: name3
│ Value: 8
│ Status: success
@smoke
Feature: Reflection Test Feature
As an user, I want to be able to perform
reflection tests
Scenario: Test Should Pass # features\ReflectionTest.feature:6
Given Test Should Pass # FeatureContext::testShouldPass()
When Test Result Passes # FeatureContext::testResultPasses()
Then Test Case Passes # FeatureContext::testCasePasses()
Scenario: Test Should Fail # features\ReflectionTest.feature:11
Given Test Should Pass # FeatureContext::testShouldPass()
When Test Result Passes # FeatureContext::testResultPasses()
Then Test Case Fails # FeatureContext::testCaseFails()
Failed asserting that false is true.
@smoke @class_test
Feature: Reflection Test Feature2
As an user, I want to be able to perform
reflection tests
@smoke
Scenario: Test Should Pass # features\smoke\ReflectionSmokeTest.feature:6
Given Test Should Pass # FeatureContext::testShouldPass()
When Test Result Passes # FeatureContext::testResultPasses()
Then Test Case Passes # FeatureContext::testCasePasses()
@smoke
Scenario: Test Should Fail # features\smoke\ReflectionSmokeTest.feature:12
Given Test Should Pass # FeatureContext::testShouldPass()
When Test Result Passes # FeatureContext::testResultPasses()
Then Test Case Fails # FeatureContext::testCaseFails()
Failed asserting that false is true.
--- Failed scenarios:
features\ReflectionTest.feature:11
features\smoke\ReflectionSmokeTest.feature:12
7 scenarios (5 passed, 2 failed)
21 steps (19 passed, 2 failed)
0m0.97s (10.13Mb)
T:\git\behatmink (master -> origin)
λ
BEHAT_PARAMS
Environment-specific settings
Some of the settings in behat.yml are environment-specific. For example, the base URL may be http://mysite.localhost in your local development environment, while on a test server it might be http://127.0.0.1:8080. Other environment-specific settings include the Drupal root path and the paths to search for subcontexts.
If you intend to run your tests in different environments, these settings should not be committed to behat.yml. Instead, they should be exported in an environment variable. Before running tests, Behat checks the BEHAT_PARAMS environment variable and adds those settings to the ones already present in behat.yml. This variable should contain a JSON object with your settings.
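For example, to point Mink at a test server you might export something like the following before running Behat (the extension keys must match the ones used in your behat.yml, and the URL here is only illustrative):

export BEHAT_PARAMS='{"extensions":{"Behat\\MinkExtension":{"base_url":"http://127.0.0.1:8080"}}}'

Note that backslashes in extension class names must be escaped inside the JSON string.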