jeyzu pushed a commit to branch master.

http://git.enlightenment.org/website/www-content.git/commit/?id=80ed35d1d6d34e64e3b84afb30d11d3f9f535e34

commit 80ed35d1d6d34e64e3b84afb30d11d3f9f535e34
Author: Jérémy Zurcher <jeremy.zurc...@heraeus.com>
Date:   Wed Nov 15 11:35:01 2017 +0100

    fix bin->hex->dec: it has been haunting me for too long
---
 pages/docs/c/start.txt | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/pages/docs/c/start.txt b/pages/docs/c/start.txt
index b0d3d2fa..d7ccc6ec 100644
--- a/pages/docs/c/start.txt
+++ b/pages/docs/c/start.txt
@@ -98,8 +98,8 @@ CPUs will do arithmetic, logic operations, change what it is they execute, and r
 To computers, numbers are a string of "bits". A bit can be on or off. Just as you may be used to decimal numbers, where each digit has 10 values (0 through 9), a computer sees numbers more simply: a digit is either 0 or 1. Just as you can make a bigger number by adding a digit (1 digit can encode 10 values, 2 digits can encode 100 values, 3 can encode 1000 values, etc.), so too with the binary (0 or 1) numbering system computers use. Every binary digit you add doubles the number of values you [...]
 
 ^Binary           ^Hexadecimal ^Decimal ^
-|101              |d           |14      |
-|00101101         |2d          |46      |
+|101              |5           |5       |
+|00101101         |2d          |45      |
 |1111001101010001 |f351        |62289   |
 
 Numbers to a computer normally come in sizes that indicate how many bits they use. The sizes that really matter are:
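For reference, the corrected rows check out: binary 101 is 5 in both hex and decimal, binary 00101101 is hex 2d / decimal 45, and binary 1111001101010001 is hex f351 / decimal 62289. A quick throwaway C sketch (my own illustration, not part of the docs page) verifies them with strtoul():

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    /* The three binary strings from the corrected table. */
    const char *bits[] = { "101", "00101101", "1111001101010001" };

    for (int i = 0; i < 3; i++) {
        /* strtoul() with base 2 parses the string as a binary number. */
        unsigned long v = strtoul(bits[i], NULL, 2);
        printf("%-16s -> hex %lx, dec %lu\n", bits[i], v, v);
    }
    return 0;
}

Any C compiler will do; it prints 5/5, 2d/45 and f351/62289, matching the fixed rows.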
