Repository: beam-site
Updated Branches:
  refs/heads/asf-site a53de48ce -> 3ad83aefd


Fix typo in programming-guide.md

Project: http://git-wip-us.apache.org/repos/asf/beam-site/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam-site/commit/da5ee690
Tree: http://git-wip-us.apache.org/repos/asf/beam-site/tree/da5ee690
Diff: http://git-wip-us.apache.org/repos/asf/beam-site/diff/da5ee690

Branch: refs/heads/asf-site
Commit: da5ee6907b1d54d32cf66b85dbd23ec9ae668474
Parents: a53de48
Author: Davor Bonaci <[email protected]>
Authored: Thu Feb 2 15:06:29 2017 -0800
Committer: GitHub <[email protected]>
Committed: Thu Feb 2 15:06:29 2017 -0800

----------------------------------------------------------------------
 src/documentation/programming-guide.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/beam-site/blob/da5ee690/src/documentation/programming-guide.md
----------------------------------------------------------------------
diff --git a/src/documentation/programming-guide.md 
b/src/documentation/programming-guide.md
index 9846929..3cd64f0 100644
--- a/src/documentation/programming-guide.md
+++ b/src/documentation/programming-guide.md
@@ -55,7 +55,7 @@ The Beam SDKs provide a number of abstractions that simplify 
the mechanics of la
 
 * `PCollection`: A `PCollection` represents a distributed data set that your 
Beam pipeline operates on. The data set can be *bounded*, meaning it comes from 
a fixed source like a file, or *unbounded*, meaning it comes from a 
continuously updating source via a subscription or other mechanism. Your 
pipeline typically creates an initial `PCollection` by reading data from an 
external data source, but you can also create a `PCollection` from in-memory 
data within your driver program. From there, `PCollection`s are the inputs and 
outputs for each step in your pipeline.
 
-* `Transform`: A `Transform` represents a data processing operation, or a 
step, in your pipeline. Every `Transform` takes one or more `PCollection` 
objects as input, perfroms a processing function that you provide on the 
elements of that `PCollection`, and produces one or more output `PCollection` 
objects. 
+* `Transform`: A `Transform` represents a data processing operation, or a 
step, in your pipeline. Every `Transform` takes one or more `PCollection` 
objects as input, performs a processing function that you provide on the 
elements of that `PCollection`, and produces one or more output `PCollection` 
objects. 
 
 * I/O `Source` and `Sink`: Beam provides `Source` and `Sink` APIs to represent 
reading and writing data, respectively. `Source` encapsulates the code 
necessary to read data into your Beam pipeline from some external source, such 
as cloud file storage or a subscription to a streaming data source. `Sink` 
likewise encapsulates the code necessary to write the elements of a 
`PCollection` to an external data sink.
 
