Aaron Schulz has uploaded a new change for review.

  https://gerrit.wikimedia.org/r/111116

Change subject: Fixes and cleanups to FileOpBatch
......................................................................

Fixes and cleanups to FileOpBatch

* Fixed a bug where operations that failed after precheck() would
  not properly update the Status. This could cause failed ops to
  be treated as successful.
* Removed the special casing for 1-sized batches; this existed mostly
  for Swift and became irrelevant after that backend was rewritten.
* Removed the unused return value and fixed the docs.

bug: 60318
Change-Id: I7f12ebf711bc196313745f943070f8bdb6335964
(cherry picked from commit 4db00dc1847f40cddfe454158ab7023809704d8f)
---
M includes/filebackend/FileOpBatch.php
1 file changed, 4 insertions(+), 11 deletions(-)


  git pull ssh://gerrit.wikimedia.org:29418/mediawiki/core refs/changes/16/111116/1

diff --git a/includes/filebackend/FileOpBatch.php b/includes/filebackend/FileOpBatch.php
index 32b65ba..3adfbb3 100644
--- a/includes/filebackend/FileOpBatch.php
+++ b/includes/filebackend/FileOpBatch.php
@@ -149,9 +149,8 @@
         * within any given sub-batch do not depend on each other.
         * This will abort remaining ops on failure.
         *
-        * @param array $pPerformOps
+        * @param array $pPerformOps Batches of file ops (batches use original indexes)
         * @param Status $status
-        * @return bool Success
         */
        protected static function runParallelBatches( array $pPerformOps, Status $status ) {
                $aborted = false; // set to true on unexpected errors
@@ -172,12 +171,8 @@
                        // If attemptAsync() returns a Status, it was either due to an error
                        // or the backend does not support async ops and did it synchronously.
                        foreach ( $performOpsBatch as $i => $fileOp ) {
-                               if ( !$fileOp->failed() ) { // failed => already has Status
-                                       // If the batch is just one operation, it's faster to avoid
-                                       // pipelining as that can involve creating new TCP connections.
-                                       $subStatus = ( count( $performOpsBatch ) > 1 )
-                                               ? $fileOp->attemptAsync()
-                                               : $fileOp->attempt();
+                               if ( !isset( $status->success[$i] ) ) { // didn't already fail in precheck()
+                                       $subStatus = $fileOp->attemptAsync();
                                        if ( $subStatus->value instanceof FileBackendStoreOpHandle ) {
                                                $opHandles[$i] = $subStatus->value; // deferred
                                        } else {
                                        } else {
@@ -189,7 +184,7 @@
                        $statuses = $statuses + $backend->executeOpHandlesInternal( $opHandles );
                        // Marshall and merge all the responses (blocking)...
                        foreach ( $performOpsBatch as $i => $fileOp ) {
-                               if ( !$fileOp->failed() ) { // failed => already has Status
+                               if ( !isset( $status->success[$i] ) ) { // didn't already fail in precheck()
                                        $subStatus = $statuses[$i];
                                        $status->merge( $subStatus );
                                        if ( $subStatus->isOK() ) {
@@ -203,7 +198,5 @@
                                }
                        }
                }
-
-               return $status;
        }
 }
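
The guard change above can be illustrated with a small sketch (Python; all names are hypothetical simplifications of the PHP original, not MediaWiki APIs). The idea: precheck() records an entry in the batch Status's success map for any op that fails early, so the attempt loop can skip exactly those indexes by checking the map, rather than relying on each op's own failed() flag, which was not set for precheck failures.

```python
# Hypothetical, simplified model of the FileOpBatch fix.
# precheck failures are recorded on the shared batch Status, keyed by the
# op's original index; the attempt phase skips any index already recorded.

class Status:
    def __init__(self):
        self.success = {}   # original op index -> bool
        self.ok = True

    def merge(self, sub_ok):
        # a failed sub-operation makes the whole batch not-OK
        self.ok = self.ok and sub_ok


def run_batch(ops, status):
    """ops: dict mapping original index -> (precheck_ok, attempt_ok)."""
    # precheck phase: record failures on the batch Status
    for i, (precheck_ok, _) in ops.items():
        if not precheck_ok:
            status.success[i] = False
            status.ok = False
    # attempt phase: only run ops that did not already fail in precheck()
    for i, (_, attempt_ok) in ops.items():
        if i not in status.success:   # the fixed guard
            status.merge(attempt_ok)
            status.success[i] = attempt_ok
    return status
```

With the old guard (per-op failed() flags that precheck never set), an op that failed precheck could be attempted anyway and end up marked successful; keying the skip on the Status's own success map closes that gap.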

-- 
To view, visit https://gerrit.wikimedia.org/r/111116
To unsubscribe, visit https://gerrit.wikimedia.org/r/settings

Gerrit-MessageType: newchange
Gerrit-Change-Id: I7f12ebf711bc196313745f943070f8bdb6335964
Gerrit-PatchSet: 1
Gerrit-Project: mediawiki/core
Gerrit-Branch: wmf/1.23wmf12
Gerrit-Owner: Aaron Schulz <[email protected]>

_______________________________________________
MediaWiki-commits mailing list
[email protected]
https://lists.wikimedia.org/mailman/listinfo/mediawiki-commits
