Manybubbles has uploaded a new change for review.

  https://gerrit.wikimedia.org/r/124385

Change subject: WIP: Faster, more stable browser tests
......................................................................

WIP: Faster, more stable browser tests

Speed up browser tests by switching from the browser to the api to create pages.
We're not testing that functionality anyway.

With the savings we can increase the timeouts on delayed update actions
without feeling like we're "taking forever".  For the most part the whole
thing should be faster even with the increases.

Needs Id0eb49fe3ac1aee3d320e9a1a6a52c7d89f7e2b9 before it'll actually work.

Change-Id: Ic5290e1d811a5634fc4111da3d2b14479dec5d5a
---
M tests/browser/Gemfile
M tests/browser/Gemfile.lock
M tests/browser/features/step_definitions/page_steps.rb
M tests/browser/features/update_general.feature
M tests/browser/features/update_non_existant.feature
M tests/browser/features/update_redirect_loop.feature
M tests/browser/features/update_weight.feature
7 files changed, 87 insertions(+), 86 deletions(-)


  git pull ssh://gerrit.wikimedia.org:29418/mediawiki/extensions/CirrusSearch refs/changes/85/124385/1
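
For review context, the page setup this change moves to the api boils down to
the pattern below; a minimal sketch, assuming the mediawiki_api 0.1.0 client
calls used in the diff (the api.php path relative to MEDIAWIKI_URL and the
example titles are illustrative, not part of the patch):

  require "mediawiki_api"

  # Log in once and create/delete pages directly, skipping the browser.
  client = MediawikiApi::Client.new("#{ENV['MEDIAWIKI_URL']}../w/api.php", false)
  client.log_in(ENV['MEDIAWIKI_USER'], ENV['MEDIAWIKI_PASSWORD'])
  client.create_page("Some page", "some contents")  # hypothetical example title
  client.delete_page("Some page", "Testing")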

diff --git a/tests/browser/Gemfile b/tests/browser/Gemfile
index e33d9cc..9dca1c6 100644
--- a/tests/browser/Gemfile
+++ b/tests/browser/Gemfile
@@ -5,3 +5,4 @@
 
 gem "mediawiki_selenium", "~> 0.2.0"
 gem "parallel_tests"
+gem "mediawiki_api"
diff --git a/tests/browser/Gemfile.lock b/tests/browser/Gemfile.lock
index 9dc8559..1e36e33 100644
--- a/tests/browser/Gemfile.lock
+++ b/tests/browser/Gemfile.lock
@@ -14,14 +14,26 @@
       faker (>= 1.1.2)
       yml_reader (>= 0.2)
     diff-lcs (1.2.5)
+    domain_name (0.5.16)
+      unf (>= 0.0.5, < 1.0.0)
     faker (1.3.0)
       i18n (~> 0.5)
+    faraday (0.9.0)
+      multipart-post (>= 1.2, < 3)
+    faraday-cookie_jar (0.0.6)
+      faraday (>= 0.7.4)
+      http-cookie (~> 1.0.0)
     ffi (1.9.3)
     gherkin (2.12.2)
       multi_json (~> 1.3)
     headless (1.0.1)
+    http-cookie (1.0.2)
+      domain_name (~> 0.5)
     i18n (0.6.9)
     json (1.8.1)
+    mediawiki_api (0.1.0)
+      faraday (~> 0.9.0)
+      faraday-cookie_jar (~> 0.0.6)
     mediawiki_selenium (0.2.10)
       cucumber (~> 1.3, >= 1.3.10)
       headless (~> 1.0, >= 1.0.1)
@@ -34,6 +46,7 @@
     mime-types (2.1)
     multi_json (1.9.0)
     multi_test (0.0.3)
+    multipart-post (2.0.0)
     net-http-persistent (2.9.4)
     page-object (0.9.7)
       page_navigation (>= 0.9)
@@ -55,6 +68,9 @@
       rubyzip (~> 1.0)
       websocket (~> 1.0.4)
     syntax (1.2.0)
+    unf (0.1.3)
+      unf_ext
+    unf_ext (0.0.6)
     watir-webdriver (0.6.8)
       selenium-webdriver (>= 2.18.0)
     websocket (1.0.7)
@@ -64,5 +80,6 @@
   ruby
 
 DEPENDENCIES
+  mediawiki_api
   mediawiki_selenium (~> 0.2.0)
   parallel_tests
diff --git a/tests/browser/features/step_definitions/page_steps.rb b/tests/browser/features/step_definitions/page_steps.rb
index 0a3dc0c..388ad72 100644
--- a/tests/browser/features/step_definitions/page_steps.rb
+++ b/tests/browser/features/step_definitions/page_steps.rb
@@ -11,15 +11,16 @@
 end
 
 Given(/^a page named (.*) doesn't exist$/) do |title|
-  visit(ArticlePage, using_params: {page_name: title}) do |page|
-    (page.create_source_link? or page.create_link?).should be_true
-  end
+  step("I delete #{title}")
 end
 
 When(/^I delete (?!the second)(.+)$/) do |title|
-  visit(DeletePage, using_params: {page_name: title}) do |page|
-    page.delete
-  end
+  require "mediawiki_api"
+  client = MediawikiApi::Client.new("#{ENV['MEDIAWIKI_URL']}../w/api.php", false)
+  client.log_in(ENV['MEDIAWIKI_USER'], ENV['MEDIAWIKI_PASSWORD'])
+  result = client.delete_page(title, "Testing")
+  result.status.should eq 200
+  result.body.should_not include '{"error":{"code"'
 end
 When(/^I edit (.+) to add (.+)$/) do |title, text|
   edit_page(title, text, true)
@@ -46,34 +47,24 @@
   if text.start_with?("@")
     text = File.read("features/support/articles/" + text[1..-1])
   end
-  visit(EditPage, using_params: {page_name: title}) do |page|
-    if (!page.article_text? and page.login?) then
-      # Looks like we're not being given the article text probably because we're
-      # trying to edit an article that requires us to be logged in.  Lets try
-      # logging in.
-      step "I am logged in"
-      visit(EditPage, using_params: {page_name: title})
-    end
-    if (page.article_text.strip != text.strip) then
-      if (!page.save? and page.login?) then
-        # Looks like I'm at a page I don't have permission to change and I'm not
-        # logged in.  Lets log in and try again.
-        step "I am logged in"
-        visit(EditPage, using_params: {page_name: title})
-      end
-      if !add then
-        page.article_text = ""
-      end
-      # Firefox chokes on huge batches of text so split it into chunks and use
-      # send_keys rather than page-objects built in += because that clears and
-      # resends everything....
-
-      # Note that this doens't work with tabs!!!!!!!
-      text.chars.each_slice(1000) do |chunk|
-        page.article_text_element.send_keys(chunk)
-      end
-      page.save
-    end
+  require "mediawiki_api"
+  client = MediawikiApi::Client.new("#{ENV['MEDIAWIKI_URL']}../w/api.php", false)
+  fetched_text = client.get_wikitext(title)
+  if fetched_text.status == 404 then
+    fetched_text = ""
+  else
+    fetched_text.status.should eq 200
+    fetched_text = fetched_text.body.strip
+  end
+  if (add)
+    # Note that the space keeps words from jamming together
+    text = fetched_text + " " + text
+  end
+  if (fetched_text.strip != text.strip) then
+    client.log_in(ENV['MEDIAWIKI_USER'], ENV['MEDIAWIKI_PASSWORD'])
+    result = client.create_page(title, text)
+    result.status.should eq 200
+    result.body.should_not include '{"error":{"code"'
   end
 end
 
diff --git a/tests/browser/features/update_general.feature b/tests/browser/features/update_general.feature
index 6ee95fe..dd85b0f 100644
--- a/tests/browser/features/update_general.feature
+++ b/tests/browser/features/update_general.feature
@@ -1,34 +1,34 @@
 @clean
 Feature: Search backend updates
-  Background:
-    Given I am logged in
-
   Scenario: Deleted pages are removed from the index
     Given a page named DeleteMe exists
-    Then within 10 seconds searching for DeleteMe yields DeleteMe as the first result
+    And I am at a random page
+    Then within 20 seconds searching for DeleteMe yields DeleteMe as the first result
     When I delete DeleteMe
-    Then within 10 seconds searching for DeleteMe yields none as the first result
+    Then within 20 seconds searching for DeleteMe yields none as the first result
 
   Scenario: Altered pages are updated in the index
     Given a page named ChangeMe exists with contents foo
     When I edit ChangeMe to add superduperchangedme
-    Then within 10 seconds searching for superduperchangedme yields ChangeMe as the first result
+    And I am at a random page
+    Then within 20 seconds searching for superduperchangedme yields ChangeMe as the first result
 
   Scenario: Pages containing altered template are updated in the index
     Given a page named Template:ChangeMe exists with contents foo
     And a page named ChangeMyTemplate exists with contents {{Template:ChangeMe}}
     When I edit Template:ChangeMe to add superduperultrachangedme
-    # Updating a template uses the job queue and that can take quite a while to complete in beta
-    Then within 10 seconds searching for superduperultrachangedme yields ChangeMyTemplate as the first result
+    And I am at a random page
+    Then within 20 seconds searching for superduperultrachangedme yields ChangeMyTemplate as the first result
 
   # This test doesn't rely on our paranoid revision delete handling logic, rather, it verifies what should work with the
   # logic with a similar degree of paranoia
   Scenario: When a revision is deleted the page is updated regardless of if the revision is current
-    Given a page named RevDelTest exists with contents first
+    Given I am logged in 
+    And a page named RevDelTest exists with contents first
     And a page named RevDelTest exists with contents delete this revision
-    And within 10 seconds searching for intitle:RevDelTest "delete this revision" yields RevDelTest as the first result
+    And within 20 seconds searching for intitle:RevDelTest "delete this revision" yields RevDelTest as the first result
     And a page named RevDelTest exists with contents current revision
     When I delete the second most recent revision of RevDelTest
-    Then within 10 seconds searching for intitle:RevDelTest "delete this revision" yields none as the first result
+    Then within 20 seconds searching for intitle:RevDelTest "delete this revision" yields none as the first result
     When I search for intitle:RevDelTest current revision
     Then RevDelTest is the first search result
diff --git a/tests/browser/features/update_non_existant.feature b/tests/browser/features/update_non_existant.feature
index fa189b0..1d75931 100644
--- a/tests/browser/features/update_non_existant.feature
+++ b/tests/browser/features/update_non_existant.feature
@@ -1,51 +1,46 @@
 @clean
 Feature: Search backend updates that reference non-existant pages
-  Background:
-    Given I am logged in
-
   @non_existant
   Scenario: Pages that link to non-existant pages still get their search index updated
     Given a page named IDontExist doesn't exist
     And a page named ILinkToNonExistantPages%{epoch} exists with contents [[IDontExist]]
-    When I search for ILinkToNonExistantPages%{epoch}
-    Then ILinkToNonExistantPages%{epoch} is the first search result
+    When I am at a random page
+    Then within 10 seconds searching for ILinkToNonExistantPages%{epoch} yields ILinkToNonExistantPages%{epoch} as the first result
 
   @non_existant
   Scenario: Pages that redirect to non-existant pages don't throw errors
     Given a page named IDontExist doesn't exist
     When a page named IRedirectToNonExistantPages%{epoch} exists with contents #REDIRECT [[IDontExist]]
-    Then I am on a page titled IRedirectToNonExistantPages%{epoch}
 
   @non_existant
   Scenario: Linking to a non-existant page doesn't add it to the search index with an [INVALID] word count
-    Given a page named IDontExistLink%{epoch} doesn't exist
-    And a page named ILinkToNonExistantPages%{epoch} exists with contents [[IDontExistLink%{epoch}]]
-    When I search for IDontExistLink%{epoch}
-    Then there are no search results with [INVALID] words in the data
+    Given a page named ILinkToNonExistantPages%{epoch} exists with contents [[IDontExistLink%{epoch}]]
+    When I am at a random page
+    Then within 20 seconds searching for IDontExistLink%{epoch} yields ILinkToNonExistantPages%{epoch} as the first result
+    And there are no search results with [INVALID] words in the data
     When a page named IDontExistLink%{epoch} exists
-    And I search for IDontExistLink%{epoch}
-    Then IDontExistLink%{epoch} is the first search result
+    Then within 10 seconds searching for IDontExistLink%{epoch} yields IDontExistLink%{epoch} as the first result
     And there are no search results with [INVALID] words in the data
 
   @non_existant
   Scenario: Redirecting to a non-existing page doesn't add it to the search index with an [INVALID] word count
-    Given a page named IDontExistRdir%{epoch} doesn't exist
-    And a page named IRedirectToNonExistantPages%{epoch} exists with contents #REDIRECT [[IDontExistRdir%{epoch}]]
-    When I search for IDontExistRdir%{epoch}
-    Then there are no search results with [INVALID] words in the data
-    When a page named IDontExistRdir%{epoch} exists
+    Given a page named IRedirectToNonExistantPages%{epoch} exists with contents #REDIRECT [[IDontExistRdir%{epoch}]]
+    And I am at a random page
+    When wait 5 seconds for the index to get the page
     And I search for IDontExistRdir%{epoch}
-    Then IDontExistRdir%{epoch} is the first search result
+    And there are no search results with [INVALID] words in the data
+    When a page named IDontExistRdir%{epoch} exists
+    Then within 10 seconds searching for IDontExistRdir%{epoch} yields IDontExistRdir%{epoch} as the first result
     And there are no search results with [INVALID] words in the data
 
   @non_existant
   Scenario: Linking to a page that redirects to a non-existing page doesn't add it to the search index with an [INVALID] word count
-    Given a page named IDontExistRdirLinked%{epoch} doesn't exist
-    And a page named IRedirectToNonExistantPagesLinked%{epoch} exists with contents #REDIRECT [[IDontExistRdirLinked%{epoch}]]
+    Given a page named IRedirectToNonExistantPagesLinked%{epoch} exists with contents #REDIRECT [[IDontExistRdirLinked%{epoch}]]
     And a page named ILinkIRedirectToNonExistantPages%{epoch} exists with contents [[IRedirectToNonExistantPagesLinked%{epoch}]]
-    When I search for IDontExistRdirLinked%{epoch}
-    Then there are no search results with [INVALID] words in the data
+    And I am at a random page
+    When wait 5 seconds for the index to get the page
+    And I search for IDontExistRdirLinked%{epoch}
+    And there are no search results with [INVALID] words in the data
     When a page named IDontExistRdirLinked%{epoch} exists
-    And I search for IDontExistRdirLinked%{epoch}
-    Then IDontExistRdirLinked%{epoch} is the first search result
+    Then within 10 seconds searching for IDontExistRdirLinked%{epoch} yields IDontExistRdirLinked%{epoch} as the first result
     And there are no search results with [INVALID] words in the data
diff --git a/tests/browser/features/update_redirect_loop.feature b/tests/browser/features/update_redirect_loop.feature
index 15b8970..959b1b8 100644
--- a/tests/browser/features/update_redirect_loop.feature
+++ b/tests/browser/features/update_redirect_loop.feature
@@ -1,19 +1,14 @@
 @clean
 Feature: Search backend updates containing redirect loops
-  Background:
-    Given I am logged in
-
   @redirect_loop
   Scenario: Pages that redirect to themself don't throw errors
-    When a page named IAmABad RedirectSelf%{epoch} exists with contents #REDIRECT [[IAmABad RedirectSelf%{epoch}]]
-    Then I am on a page titled IAmABad RedirectSelf%{epoch}
+    Then a page named IAmABad RedirectSelf%{epoch} exists with contents #REDIRECT [[IAmABad RedirectSelf%{epoch}]]
 
+  # The actual creation of the pages will fail if redirect loop handling fails
   @redirect_loop
   Scenario: Pages that form a redirect chain don't throw errors
     When a page named IAmABad RedirectChain%{epoch} A exists with contents #REDIRECT [[IAmABad RedirectChain%{epoch} B]]
     And a page named IAmABad RedirectChain%{epoch} B exists with contents #REDIRECT [[IAmABad RedirectChain%{epoch} C]]
     And a page named IAmABad RedirectChain%{epoch} C exists with contents #REDIRECT [[IAmABad RedirectChain%{epoch} D]]
-    And a page named IAmABad RedirectChain%{epoch} D exists with contents #REDIRECT [[IAmABad RedirectChain%{epoch} A]]
-    Then I am on a page titled IAmABad RedirectChain%{epoch} D
-    When a page named IAmABad RedirectChain%{epoch} B exists with contents #REDIRECT [[IAmABad RedirectChain%{epoch} D]]
-    Then I am on a page titled IAmABad RedirectChain%{epoch} B
+    Then a page named IAmABad RedirectChain%{epoch} D exists with contents #REDIRECT [[IAmABad RedirectChain%{epoch} A]]
+    And a page named IAmABad RedirectChain%{epoch} B exists with contents #REDIRECT [[IAmABad RedirectChain%{epoch} D]]
diff --git a/tests/browser/features/update_weight.feature b/tests/browser/features/update_weight.feature
index 9016c84..02978c8 100644
--- a/tests/browser/features/update_weight.feature
+++ b/tests/browser/features/update_weight.feature
@@ -1,8 +1,5 @@
 @clean
 Feature: Page updates trigger appropriate weight updates in newly linked and unlinked articles
-  Background:
-    Given I am logged in
-
   # Note that these tests can be a bit flakey if you don't use Redis and checkDelay because they count using
   # Elasticsearch which delays all updates for around a second.  So if the jobs run too fast they won't work.
   # Redis and checkDelay fix this by forcing a delay.
@@ -10,7 +7,8 @@
     Given a page named WeightedLink%{epoch} 1 exists
     And a page named WeightedLink%{epoch} 2/1 exists with contents [[WeightedLink%{epoch} 2]]
     And a page named WeightedLink%{epoch} 2 exists
-    Then within 10 seconds searching for WeightedLink%{epoch} yields WeightedLink%{epoch} 2 as the first result
+    When I am at a random page
+    Then within 20 seconds searching for WeightedLink%{epoch} yields WeightedLink%{epoch} 2 as the first result
     When a page named WeightedLink%{epoch} 1/1 exists with contents [[WeightedLink%{epoch} 1]]
     And a page named WeightedLink%{epoch} 1/2 exists with contents [[WeightedLink%{epoch} 1]]
     Then within 20 seconds searching for WeightedLink%{epoch} yields WeightedLink%{epoch} 1 as the first result
@@ -21,7 +19,8 @@
     And a page named WeightedLinkRemoveUpdate%{epoch} 1 exists
     And a page named WeightedLinkRemoveUpdate%{epoch} 2/1 exists with contents [[WeightedLinkRemoveUpdate%{epoch} 2]]
     And a page named WeightedLinkRemoveUpdate%{epoch} 2 exists
-    Then within 10 seconds searching for WeightedLinkRemoveUpdate%{epoch} yields WeightedLinkRemoveUpdate%{epoch} 1 as the first result
+    When I am at a random page
+    Then within 20 seconds searching for WeightedLinkRemoveUpdate%{epoch} yields WeightedLinkRemoveUpdate%{epoch} 1 as the first result
     When a page named WeightedLinkRemoveUpdate%{epoch} 1/1 exists with contents [[Junk]]
     And a page named WeightedLinkRemoveUpdate%{epoch} 1/2 exists with contents [[Junk]]
     Then within 20 seconds searching for WeightedLinkRemoveUpdate%{epoch} yields WeightedLinkRemoveUpdate%{epoch} 2 as the first result
@@ -32,7 +31,8 @@
     And a page named WeightedLinkRdir%{epoch} 2/Redirect exists with contents #REDIRECT [[WeightedLinkRdir%{epoch} 2]]
     And a page named WeightedLinkRdir%{epoch} 2/1 exists with contents [[WeightedLinkRdir%{epoch} 2/Redirect]]
     And a page named WeightedLinkRdir%{epoch} 2 exists
-    Then within 10 seconds searching for WeightedLinkRdir%{epoch} yields WeightedLinkRdir%{epoch} 2 as the first result
+    When I am at a random page
+    Then within 20 seconds searching for WeightedLinkRdir%{epoch} yields WeightedLinkRdir%{epoch} 2 as the first result
     When a page named WeightedLinkRdir%{epoch} 1/1 exists with contents [[WeightedLinkRdir%{epoch} 1/Redirect]]
     And a page named WeightedLinkRdir%{epoch} 1/2 exists with contents [[WeightedLinkRdir%{epoch} 1/Redirect]]
     Then within 20 seconds searching for WeightedLinkRdir%{epoch} yields WeightedLinkRdir%{epoch} 1 as the first result
@@ -45,7 +45,8 @@
     And a page named WLRURdir%{epoch} 2/Redirect exists with contents #REDIRECT [[WLRURdir%{epoch} 2]]
     And a page named WLRURdir%{epoch} 2/1 exists with contents [[WLRURdir%{epoch} 2/Redirect]]
     And a page named WLRURdir%{epoch} 2 exists
-    Then within 10 seconds searching for WLRURdir%{epoch} yields WLRURdir%{epoch} 1 as the first result
+    When I am at a random page
+    Then within 20 seconds searching for WLRURdir%{epoch} yields WLRURdir%{epoch} 1 as the first result
     When a page named WLRURdir%{epoch} 1/1 exists with contents [[Junk]]
     And a page named WLRURdir%{epoch} 1/2 exists with contents [[Junk]]
     Then within 20 seconds searching for WLRURdir%{epoch} yields WLRURdir%{epoch} 2 as the first result
@@ -60,4 +61,5 @@
     And a page named WLDoubleRdir%{epoch} 2/1 exists with contents [[WLDoubleRdir%{epoch} 2/Redirect]]
     And a page named WLDoubleRdir%{epoch} 2/2 exists with contents [[WLDoubleRdir%{epoch} 2/Redirect]]
     And a page named WLDoubleRdir%{epoch} 2 exists
-    When within 20 seconds searching for WLDoubleRdir%{epoch} yields WLDoubleRdir%{epoch} 2 as the first result
+    When I am at a random page
+    Then within 20 seconds searching for WLDoubleRdir%{epoch} yields WLDoubleRdir%{epoch} 2 as the first result
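
As a usage note, the new steps read the wiki location and credentials from the
environment rather than from the browser session. A hypothetical pre-flight
check in plain Ruby (the variable names come from the step definitions above;
the check itself is not part of this change):

  # Fail fast if the api-backed steps would be missing their configuration.
  %w[MEDIAWIKI_URL MEDIAWIKI_USER MEDIAWIKI_PASSWORD].each do |name|
    abort("#{name} must be set to run the browser tests") unless ENV[name]
  end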

-- 
To view, visit https://gerrit.wikimedia.org/r/124385
To unsubscribe, visit https://gerrit.wikimedia.org/r/settings

Gerrit-MessageType: newchange
Gerrit-Change-Id: Ic5290e1d811a5634fc4111da3d2b14479dec5d5a
Gerrit-PatchSet: 1
Gerrit-Project: mediawiki/extensions/CirrusSearch
Gerrit-Branch: master
Gerrit-Owner: Manybubbles <[email protected]>
