Re: [PD] Colour in Digital Video

2007-12-21 Thread Andrew Brouse

Hello Matju et al,

You really shouldn't take my comments personally; they aren't meant that
way at all. I do like you personally and respect the work that you do.


My comments were meant to provide useful information to all those who are 
developing and working with Digital Video in Pd. For what it's worth, the 
references I pointed to are standard fare for all video engineers, whether 
on computers or not.


I find the amount of bitterness, bickering, infighting and acrimony in
parts of the Pd community really sad. The energy that goes into it is
energy that doesn't go into making Pd and associated projects better
tools.


That's all I have to say,
Andrew

--

On Fri, 21 Dec 2007, Mathieu Bouchard wrote:


On Wed, 12 Dec 2007, Andrew Brouse wrote:


To further complicate things, our response to brightness and colour is not
linear and so those perceptual curves have to be taken into account during
that mapping into video signals (chroma and luma).


Oh and also all those absolutely perceptual systems are just discarding the 
genetic diversity of the universe (as well as the non-genetic diversity!). 
One man's metamers is another one's yuck. ;)



Most video digitisation systems also use 'chroma sub-sampling'


Yes, we know about it, but I don't think that it's so relevant to colours 
unless using the YUV image type in GEM or PDP. (In GridFlow, YUV images are 
not subsampled, and also, Claude is so far only talking about individual 
colours, as float triplets)


This uses the fact that we are more sensitive to differences in brightness
than to variations in colour.


But this does not account for the fact that we could be standing close
enough to the screen to actually see that much chroma resolution. No
explanation of subsampling ever talks about that.


The YCbCr video representation (corresponding to the YUV colour space of
our perception) uses this technique, which makes it more compact while
giving approximately equal quality to a larger RGB representation.


To say that, you have to define quality according to some idea of how one
should look at a screen, in order to justify the subsampling. But people
don't all watch a screen from the same distance.


Anyone working with colour video systems should know and read Charles 
Poynton's writing on this subject - he is the acknowledged expert. If you 
only have one book on your shelf it should be his Digital Video and HDTV.


Hey. Colours are nice, but art in general is a lot more important. If I had
only one book on my shelf, it would be something other than a treatise on
colorimetry.



[you can also be amazed, amused or angered by the way he vacillates
between 'color' and 'colour' in his writing...]


None of the above...



[PD] Colour in Digital Video

2007-12-12 Thread Andrew Brouse
Hello Claude, Matju et al,

Colour perception is probably as complicated as auditory perception, and
the accurate representation of colour on digital video systems is probably
more difficult than the accurate representation of audio. We can perceive
a much wider range of colour and brightness (chrominance and luminance)
than the majority of digital video systems can represent, so this
wide-gamut information has to be mapped to the narrower gamut of the
systems we have.

To further complicate things, our response to brightness and colour is not
linear and so those perceptual curves have to be taken into account during
that mapping into video signals (chroma and luma). There are also many
other non-linearities in how video displays work, which have to be
compensated for as well. Add the complexities of PAL, NTSC and SECAM to
that and you're set for a party!

Most video digitisation systems also use 'chroma sub-sampling', which is a
way of reducing the amount of information which needs to be stored (this
is a first level of compression or coding which occurs before anything
else, like MPEG compression, is applied). This uses the fact that we are
more sensitive to differences in brightness than to variations in colour.
The YCbCr video representation (corresponding to the YUV colour space of
our perception) uses this technique, which makes it more compact while
giving approximately equal quality to a larger RGB representation.
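
To make those two ideas concrete, here is a minimal Python sketch (an
illustration only, not GEM or PDP code; it assumes 8-bit full-range R'G'B'
input, the BT.601 matrix and an even-sized chroma plane):

def rgb_to_ycbcr(r, g, b):
    # BT.601 8-bit conversion: full-range R'G'B' (0-255) in, studio-range
    # Y'CbCr out, with luma on 16-235 and both chroma signals centred on 128.
    y  =  16 + ( 65.481 * r + 128.553 * g +  24.966 * b) / 255.0
    cb = 128 + (-37.797 * r -  74.203 * g + 112.000 * b) / 255.0
    cr = 128 + (112.000 * r -  93.786 * g -  18.214 * b) / 255.0
    return y, cb, cr

def subsample_420(chroma_plane):
    # 4:2:0 chroma sub-sampling: average each 2x2 block down to one sample,
    # so a chroma plane keeps a quarter of its samples while luma stays at
    # full resolution (even width and height are assumed).
    h, w = len(chroma_plane), len(chroma_plane[0])
    return [[(chroma_plane[j][i]     + chroma_plane[j][i + 1] +
              chroma_plane[j + 1][i] + chroma_plane[j + 1][i + 1]) / 4.0
             for i in range(0, w, 2)]
            for j in range(0, h, 2)]

Per pixel, 4:2:0 stores one luma sample plus half a chroma sample on
average - 12 bits instead of 24 for RGB - which is where the 'roughly equal
quality in a smaller representation' comes from.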

Anyone working with colour video systems should know and read Charles 
Poynton's writing on this subject - he is the acknowledged expert. If you 
only have one book on your shelf it should be his Digital Video and 
HDTV. He also has much useful information on his web site:
http://poynton.com/

For example, he has a guided tour of colour space here: 
http://poynton.com/papers/Guided_tour/abstract.html

[you can also be amazed, amused or angered by the way he vacillates 
between 'color' and 'colour' in his writing...]

hope that helps,
Andrew

p.s. another notable datum: Poynton taught the very first microprocessor
course ever given at an art school (in the 1970s, at the Ontario College
of Art in Toronto).



On Wed, 12 Dec 2007, [EMAIL PROTECTED] wrote:

Hi,

Redirecting from GEM-dev as it's not about GEM development...

Mathieu Bouchard wrote:
 if you do a polar transform on YUV, you have something easier, faster
 and more correct all at once. I usually just skip the polar transform:
 if you apply rotations directly on YUV values, you can make very
 believable hue shifts.

Interesting, I'm in the process of experimenting a bit with different
colour spaces; I got a real headache with XYZ and CIE L*a*b* and so on,
but YUV's simplicity may win.
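
That simplicity shows up in a rough sketch of the rotation idea (a Python
illustration rather than GridFlow code; the constants are the usual
analogue BT.601 ones and RGB values are assumed to lie in 0..1):

import math

def rgb_to_yuv(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b          # BT.601 luma
    return y, 0.492 * (b - y), 0.877 * (r - y)     # U and V chroma

def yuv_to_rgb(y, u, v):
    return (y + 1.140 * v,
            y - 0.395 * u - 0.581 * v,
            y + 2.032 * u)

def hue_shift(r, g, b, degrees):
    # Rotate the chroma vector in the (U, V) plane; luma is untouched,
    # so apparent brightness stays put while the hue goes around.
    y, u, v = rgb_to_yuv(r, g, b)
    a = math.radians(degrees)
    return yuv_to_rgb(y,
                      u * math.cos(a) - v * math.sin(a),
                      u * math.sin(a) + v * math.cos(a))

The result can land slightly outside the RGB cube for strongly saturated
inputs, so a real patch would clip or rescale on the way out.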

 HSV is dubious in part because the apparent brightness at maximum
 so-called value is very variable and seems to peak high or low at
 secondaries or primaries: compare yellow (brightness 89%) and blue
 (brightness 11%). this really makes HSV suck sometimes. YUV does not
 have this problem.

I tried a hybrid approach:

$1 1 1
   |
[hsv2rgb]
   |
[rgb2yuv]
   |
0.5 $2 $3
   |
[yuv2rgb]

and that seems to eliminate the bad brightness mismatches, at the cost
of some colours seeming a bit washed out (blue) or muddy (yellow).

Attached image demonstrates the difference.
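
The hybrid chain can be mirrored in a few lines of Python (a sketch, not
the actual Pd objects; colorsys is the standard-library HSV converter, the
YUV constants are the analogue BT.601 ones, and the function name is made
up):

import colorsys  # standard-library HSV <-> RGB

def hue_at_fixed_luma(h, luma=0.5):
    # Mirror of the patch above: take a fully saturated, full-value HSV
    # hue, keep its chroma (U, V), force the BT.601 luma to a constant,
    # then convert back to RGB, clipping anything outside the RGB cube.
    r, g, b = colorsys.hsv_to_rgb(h, 1.0, 1.0)      # [hsv2rgb] with s = v = 1
    y = 0.299 * r + 0.587 * g + 0.114 * b           # [rgb2yuv], luma part
    u, v = 0.492 * (b - y), 0.877 * (r - y)         # [rgb2yuv], chroma part
    r, g, b = (luma + 1.140 * v,                    # [yuv2rgb] with Y forced
               luma - 0.395 * u - 0.581 * v,        # to 0.5, as in the
               luma + 2.032 * u)                    # "0.5 $2 $3" message
    return tuple(min(max(c, 0.0), 1.0) for c in (r, g, b))

Forcing Y to 0.5 is what evens out the yellow/blue brightness mismatch; the
clipping in the last line is probably also where the washed-out blues and
muddy yellows come from, since a saturated hue pushed to mid luma no longer
fits inside the RGB cube.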


Wondering if there's some set of perceptual brightness curves similar
to the isophonic (equal-loudness) curves [1] for the perceived loudness of
different frequencies and levels led me to [2], which seems very
complicated again.




[PD] code and compilers

2007-12-08 Thread Andrew Brouse

On Sat, 8 Dec 2007, [EMAIL PROTECTED] wrote:

 On Fri, 7 Dec 2007, Hans-Christoph Steiner wrote:

 As for Pd vs. C, there was a time in the not-so-distant past where
 programmers thought that compilers were horribly inefficient, and that
 they were only really good for prototyping things.  Then you'd code
 things for real in assembly.  That lasted well into the 80's.

 It even lasted well into the 90's, but it depends on what for. The 80's had
 plenty of apps using a blend of asm and C-or-Pascal, while in the 90's
 it became limited to really needy applications (games, demoscene, etc).
 The amount of asm code still being written is shrinking, but it is still
 somewhat present. For example, devel_0_39 has asm code in it.

I once wrote a final exam for a mandatory course in Assembly Language 
programming in the late 1990s where the most important question was to 
determine what a block of Motorola 68k assembly code was doing. People 
were just pulling their hair out! (mine never grew back!). It turned out 
the code was doing basically nothing. That was one of the best CS courses 
I ever took.

An old-school hacker (poet turned programmer, classic!) once told me that
he used to debug his programmes (on mainframes, with not even 1M of 
memory) by actually just watching a display of activity in all memory 
locations. After a while, he just subconsciously internalised what was 
going on and managed to debug the code.

The question today might be whether auto-vectorising compilers are as
good as hand-unrolled loops (I highly doubt it). The point is well taken,
however: we should use high-level tools for the rapidity, expressivity and
clarity they offer us in realising our particular ideas. It never hurts,
nonetheless, to understand what that code is actually doing on the
processor and in memory etc.

On a fundamental level, any code we run is just changing patterns of 
electrons bouncing about on a wafer of impure silicon. Thus, when our 
high-level tools fail us or don't give the expected result, we sometimes 
have to dig down a little deeper. Maybe there is a chunk of code, like 
that block of 68k assembly, which is just churning electrons around, 
basically doing nothing.

cheers,
Andrew


"[...] the introduction of the blackboard has had more impact on classroom
education than any innovation in technology since, including the
introduction of cheap paper or the introduction of the internet and
personal computers."

-- Bill Buxton



[PD] Max Pd

2007-12-05 Thread Andrew Brouse
Hello Pd and Max folks,

I am doing a presentation (tomorrow!... so this request is a bit late!) on
differences between Max and Pd as tools for music and media art.

I am interested in hearing:

1. from people who actively use both

2. about less-obvious advantages/disadvantages of one or the other

3. specifically about functionality for manipulating video, OpenGL
(including shaders) and matrix data

4. clear, reasoned, articulate thoughts and arguments as to why one or the
other is better or worse for one or another particular use (why should I
expect anything else! ;)

This could have some impact on decisions which will be made for a project 
which I can't talk about yet. :)

thanks for your help,
Andrew






Re: [PD] gem codec basic review

2007-12-02 Thread Andrew Brouse
Hi Patrick,

As Marius noted, the best CODEC depends on what you are going to do with
the video. The MPEG gang of CODECs (from MPEG-2 on through H.264) are very
good for standard playback and have a good quality/file-size ratio. All
visual CODECs use spatial coding - reduction of perceptual redundancies
within the 2D plane of a given frame; but another way these CODECs achieve
such good ratios is by using temporal coding to reduce frame-to-frame
redundancies. They take a GOP (Group of Pictures, not an antediluvian
near-extinct political party) and make decisions about what information is
important. Some frames are fully represented, and other frames are coded
with reference to the ones around them. This saves a lot of space, but you
do not have a full representation of each frame. During normal playback
this is not a problem, but if you want to mess with the playback rate it
is.
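
A toy Python model (a simplification that assumes one fully coded intra
frame at the start of each GOP, every other frame predicted from the one
before it, and no B-frames or reordering) shows why that matters for
anything other than straight playback:

def frames_to_decode(target_frame, gop_size):
    # Frames that must be decoded to display target_frame when only every
    # gop_size-th frame is an intra frame and the rest are predicted from
    # their predecessor.
    last_intra = (target_frame // gop_size) * gop_size
    return list(range(last_intra, target_frame + 1))

# Scrubbing to frame 14 of a 15-frame GOP means decoding 15 frames; with an
# intra-only CODEC such as Photo-JPEG it means decoding just one.
print(len(frames_to_decode(14, 15)))   # -> 15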

If you want to do any scrubbing, varispeed or backwards playback of the
video, you need a CODEC which does not use temporal compression.
Photo-JPEG is in fact a very good choice in this case, and if compressed at
320 X 240 you get a good balance of file size, picture quality and
processor load. JPEG 2000 should also be a good choice (it is supported by
QuickTime, and there are libraries which support it on Linux, though I am
not sure if it will play in GEM). I have used DV video with good results,
and this may be a good option if you need high-quality playback.

However, if you have unlimited bandwidth and storage, why not just go for
uncompressed D-1 video? (~30 MB/s) ;)

cheers,
Andrew


On Fri, 30 Nov 2007, [EMAIL PROTECTED] wrote:

 hi,

 i am on linux running the latest version of gem from cvs. i am trying
 to find a good codec for gem. here's my basic research:

 ---
 the best codec for quicktime is jpeg:
 transcode -i yourvideo -y mov,null -F jpeg,,jpeg_quality=70 -o gem.mov

 gem cpu usage is 38%
 mplayer cpu usage is 27%
 ffplay cpu usage is 16%
 lqtplay cpu usage is 1% *

 ---
 for avi i tried ffmpeg / mjpeg:
 transcode -i yourvideo -y ffmpeg,null -F mjpeg -o gem.avi

 gem cpu usage is 33%
 mplayer cpu usage is 33%
 ffplay cpu usage is 22%

 ---
 the best codec around, though, is mpeg4:
 transcode -i yourvideo -y ffmpeg,null -F mpeg4 -o gem.avi

 gem cpu usage is: crashed**
 mplayer cpu usage is 16%
 ffplay cpu usage is 11%


 * lqtplay seems to do an excellent job of decoding its own codec. would
 it be possible to make a pix_qt based on the source of this player???

 ** can anybody test this codec? that would be my second choice... after
 lqtplay is ported to gem...

 pat



Re: [PD] gem codec basic review

2007-12-02 Thread Andrew Brouse
Hi Max,

JPEG decompression should be very fast on any modern CPU; it is possible,
however, that at those frame sizes and 30 FPS you are seeing some jitter.
If you are at 30 FPS, try reducing it to 15 or even 10 FPS and see
what happens.
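
One rough way to check whether decode time really varies with the content
(a sketch, not GEM code; it assumes Pillow is installed and that the movie
has been exported as a JPEG image sequence under the hypothetical path
frames/) is to time each decode:

import glob
import time
from PIL import Image  # Pillow

decode_ms = []
for path in sorted(glob.glob("frames/*.jpg")):
    t0 = time.perf_counter()
    Image.open(path).load()                 # .load() forces the actual decode
    decode_ms.append((time.perf_counter() - t0) * 1000.0)

if decode_ms:
    print("decode time per frame: min %.1f ms, max %.1f ms, mean %.1f ms"
          % (min(decode_ms), max(decode_ms),
             sum(decode_ms) / len(decode_ms)))

A large spread between min and max would support the idea that busier
frames take longer to decode; at 30 FPS, anything much above 33 ms per
frame will show up as dropped or late frames.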

cheers,
Andrew

-- 

On Sun, 2 Dec 2007, Max Neupert wrote:


 Am 02.12.2007 um 16:35 schrieb Andrew Brouse:
 If you want to do any scrubbing, varispeed or playing backwards of the
 video, you need a CODEC which does not use temporal compression.
 Photo-JPEG is in fact a very good choice in this case and if compressed at
 320 X 240 you get a good balance of file-size, picture quality and
 processor load. JPEG 2000 should also be a good choice (supported by
 QuickTime and there are libraries which support it on Linux but not sure
 if it will play in GEM).

 hi andrew,

 i wonder if the photo-jpeg compression introduces jitter in the decoding
 speed depending on the complexity of the content. gem would need a
 different amount of time to decode each frame, depending on which frame is
 called. i am working now with 1024x768 video compressed with 75% quality
 photojpeg and sometimes see lags in the decoding... but maybe it's just my
 imagination.

 max



[PD] Pd performance on OS X

2007-11-23 Thread Andrew Brouse
On Thu, 22 Nov 2007, [EMAIL PROTECTED] wrote:

 I use JACK almost always on OSX with PD (when I use it on OSX at all),
 and still the speed is quite poor. I'm fairly positive it has to do with
 whether Aqua handles the graphical interface or not. All those brushed
 chrome windows and colorful spinning widgets sure are expensive!

This statement is factually incorrect, as noted elsewhere.
Some points to consider:

1. Aqua is the name of the visual theme of OS X 10.0 circa 2001.

2. Quartz (Quartz Extreme if you must) is the window rendering engine on
OS X and it runs almost entirely on the GPU. The WindowServer process does
manage some other chores on the CPU, but it uses only about 1% CPU.

3. Core Audio is extremely efficient and a well-written Core Audio app 
will take very little CPU especially when idle.

4. Well-written CLI apps run as fast on OS X as on other Un*xes, or faster.

5. Extremely sophisticated audio and video applications do run quite well 
with reasonably low CPU usage on OS X.

Maybe my previous top data was too subtle. A quick check reveals that for 
some reason, just turning audio rendering on soaks up about 20% of the CPU 
and seems to involve a fair bit of thrashing about in the kernel and 
memory. I don't know enough about the internals of Pd to add any more 
insightful comments, but here are some raw figures comparing Pd and iTunes 
CPU and system usage:

Pd

idle audio off: 1.6% CPU, 1000 SysCalls/sec (Mach + BSD)
idle audio on:  21% CPU, 4400 SysCalls/sec (Mach + BSD)
#22 patch open: 29% CPU, 4700 SysCalls/sec (Mach + BSD)
#22 patch w GemWin: 34% CPU, 5300 SysCalls/sec (Mach + BSD)
#22 patch running:  46% CPU, 5300 SysCalls/sec (Mach + BSD)

iTunes

idle not playing:   0.0% CPU, 5-20 SysCalls/sec (Mach + BSD)
playing:            9.3% CPU, 900 SysCalls/sec (Mach + BSD)

This could have to do with PortAudio as the glue to CoreAudio, I don't 
know.

Thomas Grill was quite right. What is needed is for those with the time 
and motivation to use the performance profiling tools (such as Shark, 
Sampler, BigTop etc.) which come with Apple's free developer tools and dig 
into Pd to discover what is chewing up CPU cycles. Here are some guides:

Performance Guides:
http://developer.apple.com/documentation/Performance/index.html

Performance Tools:
http://developer.apple.com/documentation/DeveloperTools/Performance-date.html#//apple_ref/doc/uid/TP3440-TP3436-TP3901


cheers,
Andrew



Re: [PD] performance on osX

2007-11-22 Thread Andrew Brouse
salut Cyrille,


Maybe the following will be helpful.
(MacBook Pro, 2.4 GHz Core 2 duo, OS 10.5.1, 2GB ram)

/Applications/Pd-0.40.3-extended-20071117.app running with no patch open:

   PID COMMAND  %CPU   TIME   #TH #PRTS #MREGS RPRVT  RSHRD  RSIZE VSIZE
  4784 pd  21.3%  0:37.54   698288 K  4152K12M   40

/Applications/Pd-0.40.3-extended-20071117.app with example pmpd #22 open 
and gemwin open still not running:

   PID COMMAND  %CPU   TIME   #TH #PRTS #MREGS RPRVT  RSHRD  RSIZE  VSIZE
  4784 pd  23.5%  2:09.90   6   129   422   10M   16M   23M   414M

/Applications/Pd-0.40.3-extended-20071117.app with example pmpd #22 open 
and running:

   PID COMMAND  %CPU   TIME   #TH #PRTS #MREGS RPRVT  RSHRD  RSIZE  VSIZE
  4784 pd  39.1%  3:32.09   6   129   423   11M   13M   23M   414M

Andrew

 i just made a performance test on osX, using pmpd example 22 (i had a
 similar problem with other patches, this is just an example)

 with pd extended, or pd/Gem and pmpd compiled from cvs, this patch uses
 about 40% CPU on a macbook pro 2.33 GHz.

 on my computer (dual core 2 GHz), it uses about 22% (in both cases, the %
 is measured for only 1 CPU: both computers are dual core)
 i did not do anything special on my computer to optimise the compilation,
 nor on the macbook.

 could anyone with a macbook pro test this patch and tell me the
 CPU used?



[PD] directory layout for pd-extended on OS X

2007-11-17 Thread Andrew Brouse
On Sat, 17 Nov 2007, [EMAIL PROTECTED] wrote:

 Actually, that reminds me, I would like to have the Pd-extended
 package use a directory for people to install their own externals.
 On Mac OS X, it's /Library/Pd following those conventions.  I am
 thinking that /usr/local/lib/pd/ should be the user-installed stuff,
 with everything going in there (i.e. help patches, binaries, and .pd
 files).  Then the .deb should install into /usr/lib.  This means it
 would conflict with the 'puredata' package that's included in Debian,
 but it's the proper way to do it.

 .hc

If you wanted to truly follow the OS X directory layout, externals and
such installed by the application should be in:
/Library/Application Support/Pd

while user-installed externals etc. would go in:
~/Library/Application Support/Pd

This would allow different users on the same system to have different
configurations.

BTW, extended is looking really nice on Leopard, no crashing here. :)
Thanks for the hard work.

Andrew



[PD] Bbang ~ Max/Pd users group in Brussels

2007-10-29 Thread Andrew Brouse
Dear Pure Data and Max programmers,

A new users group is being set up in Brussels for those interested in  
Max and Pd.

Yves Bernard of iMAL has kindly set up a mailing list to further this  
end, more info is here:
http://www.imal.org/mailman/listinfo/bbang

Anyone is welcome to subscribe and those interested in participating  
in this group are highly encouraged to do so.
Pd and Max enthusiasts from within the surrounding high-speed rail  
neighbourhood are especially welcome.

We are looking towards planning a first meeting sometime before the  
end of this year with following meetings happening approximately  
monthly.

best regards,
Andrew Brouse



Re: [PD] [OT] about sexism

2007-10-22 Thread Andrew Brouse
Hello Pders,

I just wanted to pick up where Alex left off and provide some more  
statistics regarding gender balance in the papers sessions. Here are  
some raw and slightly warmed-over stats:

===

Number of submissions received for the papers sessions :

44*

Number of submissions received for the papers sessions from men:

41**
or
42**

Number of submissions received for the papers sessions from women:

3**
or
2**

Submissions from women as a percentage of the whole:

3 / 44 * 100 = 6.81%
or
2 / 44 * 100 = 4.54%

*(the count actually went to 46, but two were just tests)
**(this is based on the name of the listed principal author, and  
there is one name I am not sure about, I show the numbers for both  
possible cases.)

===

Number of submissions which were accepted:

26

Submissions which were accepted as a percentage of the whole:

26 / 44 * 100 = 59.09%

Number of submissions from women which were accepted:

2

Percentage of submissions from women which were accepted:

2 / 3 * 100 = 66.6%
or
2 / 2 * 100 = 100%

Number of women who were accepted and who did attend and present at  
PdCon:

1

===

One of the clearly agreed and accepted principles shared by all  
members of the executive and steering committees for PdCon was that  
we wanted to enable - as much as possible - the participation of  
women, minorities and people from outside of the euro-centric  
mainstream.

Given those precepts, we did our best with what we had to work from.
Were we successful? I'll leave that for others to judge.

It is clear, in any case, that for these sorts of gatherings in the  
Pd community, there is still a long way to go.

As an old gauchiste, I have always believed that the ultimate goal  
for human society was for all people to be considered and treated as  
equals - in all aspects of life.

I also always understood the notion of 'Politically Correct' - a
critical term which originated on the left, but has now been co-opted
by the right - to indicate the tendency of some to be more
interested in linguistic semantics than in material reality.

I am interested in material reality.
I am interested in seeing redress, actual material change  - in our  
society, in our world - for those who have been materially  
disfavoured. I do what I can to help that process along.

At the same time, there are real differences between women and men.
There is no point trying to minimise those differences.
Women are equal, but different.

What a beautiful difference.


Andrew




On 21-Oct-07, at 8:25 PM, [EMAIL PROTECTED] wrote:

 Message: 3
 Date: Sun, 21 Oct 2007 12:13:51 -0400 (EDT)
 From: Alexandre Castonguay [EMAIL PROTECTED]
 Subject: Re: [PD] [OT] Re: about sexism is TERMINATE THREAD PLEASE
 To: [EMAIL PROTECTED]
 Cc: pd-list@iem.at
 Message-ID: [EMAIL PROTECTED]
 Content-Type: text/plain;charset=iso-8859-1

 Hi all, Yves,

 Here are some facts that may help explain and paint a correct picture of
 the convention's gender distribution.

 Number of applications received for the exhibition component :

 9 (F)
 26 (M)

 Invitations sent :

 6 (F)
 12 (M)

 Number of applications for performances (* I am unsure as to the  
 gender of
 one applicant as we didn't ask people to specify it in their  
 application.)

 3 (F)
 32 (M)

 Invitations sent :

 2 (F)
 18 (M)

 As Andrew Brouse noted, the applications for papers did not carry the
 author's names so it makes it hard to get a picture of the gender
 breakdown. Out of 46 'papers invitations', 2 were extended to women  
 and I
 believe that may unfortunately be the number of applicants?  Andrew  
 may be
 able to answer to that.

 I believe that the impression Yves got is justified.  It is just  
 that the
 community is overwhelmingly male and 'white' (another thread!).  It  
 also
 seems that the juries for the papers, exhibition and performances were
 conscious of the fact as it is somewhat reflected in the final  
 breakdown
 of invitations sent.

 Some observations on other parts of this long thread that may yet  
 yield
 something positive.

 * The component of the convention that had the highest  
 representation of
 women applicants was the exhibition. It shows that this form of
 contribution is often the way through which women enter the  
 community. It
 should be maintained and expanded through other conventions.

 * I heard through the application process that some women were  
 intimidated
 by the perceived technological sophistication of the pd scene and  
 thought
 that their work may not be 'pure' enough to warrant an application. In
 that light, dismissing people whose work process calls on external
 expertise to be realized does not help with that perception of purity.

 * I am personally glad that our efforts of providing better  
 documentation
 and access to the software got a renewed push through the work  
 groups and
 discussions happening at the convention.

 * Building a more representative community will take time

[PD] PDCon07 peer review process

2007-10-02 Thread Andrew Brouse
On 30-Sep-07, at 5:45 PM, [EMAIL PROTECTED] wrote:

 yeh i think people understand me when i complain
 about MB's spam, and also, about the fact that the pd
 convention was MB convention
( of course, when you're in the convention committee
 and that you worked previously
 with all the curators out there ),

This was brought to my attention and I feel I need to respond to it.

I have no interest in getting involved in personal squabbles here.
(I suggest you take such things 'out back': off-list.)

I do, however, take very great exception to the imputation that the
selection processes for the PDCon07 artistic events or papers sessions
were conducted in anything less than strict conformance with established
international norms for such things.

A couple of clarifications:

Mathieu Bouchard was on the steering committee, not the executive; he did
not have a say in the final decisions which were made.

Everyone who worked on PDCon07 worked extremely diligently for long
hours over the period of a year. The number of hours I put in caused me
some serious problems where I work and with my ongoing PhD thesis.
Organising the papers sessions often felt like a completely thankless job,
and I regularly had to ask myself whether it was all worth it. To have it
suggested flippantly - after all the blood, sweat and tears - that the
selection process was somehow rigged or unfair is very distressing. It
really hurts. I'm sure the other members of the executive feel the same
way.

At the end of the day, I justified it to myself this way: free and  
open-source tools like Pd are increasingly essential for musicians  
and artists to do their work. As someone who used to work in  
sculpture with wood, stone and steel, I know intimately that the  
quality of the work you do depends - amongst other factors - on  
having reliable tools. The same is true for computer music and media  
art. Commercial tools for computer music and media art are becoming  
increasingly expensive and unreliable at the same time. We need tools  
like Pd. At the same time it seems to me that Pd is at a crucial  
stage in its evolution and significant changes are happening. So, the  
papers sessions were seen as a way of enabling the important  
discussions which need to happen to go forward. I wanted to put the  
ideas proposed by the diverse members of the Pd community into a  
context of calm, respectful - yet vigourous - debate and discussion.  
I wanted to put the best ideas into the clear light;  and let those  
ideas be seen and discussed based on their own merits. To enable  
this, the selection process had to be as fair and transparent as  
possible, and so it was.

As I had nothing to do with the artistic selection process I cannot  
comment on that.
I was, however, the papers chair and did manage that process so I  
will outline to you exactly how the decisions were made:

0. Calls for papers went out and 44 submissions were received via the web
interface.

1. Suggestions for potential peer reviewers were requested from  
within the local Pd community.

2. A list of potential reviewers was compiled and invitations were made.

3. Slightly less than half of those invited did accept.

4. Submissions were assigned to reviewers, with at least 3 reviewers per
paper. (There was one exception to this, as one of our reviewers bailed
out at the last minute; this particular case was brought to the executive
for discussion and the paper was in fact accepted.)

5. Reviews were compiled and a cut-off score was decided (4.0 out of a
possible 6.0). All papers above this threshold were accepted. (Note:
Mathieu Bouchard was listed as principal or secondary author on 4
submissions, 3 of which were accepted, all with a score of 4.0 or higher.)

6. A small number of papers below this threshold which broached subjects
judged to be important to the Pd community were accepted, for reasons of
universal interest and thematic consistency. A total of 26 papers were
finally accepted.

7. The authors were informed of the results and comments from the  
reviewers were passed on.

8. Most authors re-submitted revised versions of their papers taking  
into account the comments of the reviewers.

9. Most authors did come and present at PDCon with a small number  
unable to attend due to financial or other issues.


Anyone who has any questions, issues or complaints about the papers  
review process for PDCon07, please contact me directly at:
[EMAIL PROTECTED]

Anyone who has any questions, issues or complaints about the artistic  
review process for PDCon07, please contact Marc Fournel directly at:
[EMAIL PROTECTED]

Thanks once again to all who participated in PDCon07 and especially  
to those who submitted papers presented at PDCon and helped so  
greatly by reading and reviewing submissions for us.


best regards,
Andrew Brouse
Louvain-la-Neuve, Belgium

=

Andrew Brouse
PDCon07 Papers Chair
[EMAIL PROTECTED]
http://pure

[PD] PDCon Last Chance to submit a paper

2007-05-01 Thread Andrew Brouse
Dear PD folks,

Although the official deadline to submit papers for PDCon has passed, the
web interface will remain open for 1 more day (until the rising of the
full moon).

Submissions interface is here:
http://pure-data.ca/openconf.php

Guidelines and templates are here:
http://puredata.org/community/projects/convention07/guidelines#papers

The review process will begin soon, so this is your last chance.
All it takes is a 1-page abstract; you can also modify your submission
once you have made it.

[this message only posted to PureData community lists]

thanks,
Andrew Brouse




Re: [PD] PureData Convention Papers Abstracts ...

2007-04-16 Thread Andrew Brouse
 shouldn't the dates be updated on the webpages, like:

 Guidelines with updated templates are here:
 http://www.puredata.org/community/projects/convention07/guidelines#papers

The official dates are on the official call here:
http://pure-data.ca/call.html

For the meantime, the submission form is still open (wink, wink):
http://pure-data.ca/openconf.php

I am too busy writing the grants that will allow the convention to exist
to verify that the dates everywhere are synchronised. If someone sends me
the specific URIs where they are inconsistent, I will get around to making
the change once the grants are finished.

One question for the PD community: can anyone give me an estimate of the
number of users/developers worldwide? In your country (Austria, Finland
etc.)? How many people are on the PD list?

thanks very much.

yours,
Andrew



[PD] Subject: PureData Convention Papers Abstracts due soon

2007-04-15 Thread Andrew Brouse
Hello PDers,

This is just a gentle reminder that abstracts are soon due for those who 
wish to present papers at PDCon 2007.

I know there are lots of interesting things going on out there, so please 
do share with your colleagues!

All it takes is a 1 page (max) abstract right now. If accepted, you then 
have until June 30 to finish the paper.

Depending on how our grant applications go, there may also be some 
possible funding for travel etc.

The papers call is here:
http://pure-data.ca/

Guidelines with updated templates are here:
http://www.puredata.org/community/projects/convention07/guidelines#papers

Submission form is here:
http://pure-data.ca/openconf.php

More information about the convention is here:
http://www.puredata.org/community/projects/convention07/

thanks,
Andrew




[PD] [OT] eNTERFACE07 : workshop in biologically controlled audio-visual synthesis

2007-04-11 Thread Andrew Brouse
eNTERFACE07 : workshop in biologically controlled audio-visual synthesis

Dear Colleagues,

This summer, the third eNTERFACE'07 summer workshop in multimodal 
human-computer interfaces will take place from July 16th-August 10th on 
the campus of Bogazici (Bosporus) University in Istanbul.

The eNTERFACE summer workshops aim at establishing a tradition of 
collaborative, localized research and development by gathering together 
academics, researchers, and students - for a period of one month - to work 
on specific challenges relating to multimodal interfaces.

For more information, please take a look at the web site:
http://www.enterface.net/enterface07/

One of the projects this year proposes to continue previous research into 
the use of physiological signals (EEG, EMG, ECG etc.) to control digital 
sound and image synthesis processes.

We plan to expand upon previous work into Biologically-Driven Musical 
Instruments by further investigating the - medical, scientific and 
aesthetic - uses of Brain-Computer Interfaces and other modes of 
biological control over computing machinery.

For more detailed information on this project, please see the following:
http://www.cmpe.boun.edu.tr/enterface07/callForProjects.php#p11

This workshop is funded by the European Union's SIMILAR 
(http://www.similar.cc/) Network of Excellence and participants only have 
to pay airfare and lodging costs, for which some grants may be available. 
Please indicate in your application if you wish to be considered for this 
additional funding.

To submit an application for consideration, please send your CV and a 
short statement of interest which mentions your motivation for 
participating and the skills which you can bring to the project. 
Applications should be sent ASAP to: [EMAIL PROTECTED] and 
[EMAIL PROTECTED]




[PD] PDCon Papers Call v.2 - deadline extended to April 15

2007-04-02 Thread Andrew Brouse

Dear PureData people,

Following is the 2nd call for abstracts and papers - with an extended deadline 
- for the PureData Convention. Just in time for the full moon ;)


Any queries can be directed to [EMAIL PROTECTED]

Andrew

--

[French version follows]

=

L'œuvre ouverte | PureData Convention '07
* August 21st-26th 2007, Montréal Canada *

2nd Call for Papers - Deadline extended

New Deadline for Abstracts: April 15th 2007
Acceptance announced: May 31
Camera-ready papers due: June 30

More information:
http://www.puredata.org/community/projects/convention07/

Papers submission form:
http://pure-data.ca/openconf.php

The PureData Convention Steering Committee is now accepting abstracts and full 
papers to be considered for inclusion in L'œuvre ouverte (the Open Work), the 
2nd International PureData Convention. L'œuvre ouverte invites an open relation 
between the artwork and the public as well as open attitudes and practices in 
the fields of computer programming, artistic creation and scientific research. 
It will bring together artists, writers, programmers and others who develop, 
use and reflect on PureData and Open-Source culture.


In order to provide a theoretical and artistic context for the understanding of 
the aesthetics and politics of Free and Open-Source software culture, papers 
from widely divergent perspectives are encouraged.


Abstracts and papers may be submitted in either French or English. There will 
be three primary papers streams: Technical, Artistic and Theoretical.


Some possible themes could include:

- programming PureData externals
- PureData versus Max/MSP
- interfacing with sensors and actuators
- real-time responsive, immersive and interactive environments
- hacking PureData
- using PureData in installations
- use of PureData in live performance
- sampling and Plunderphonics as artistic strategies
- re-evaluating Umberto Eco's The Open Work
- OpenContent and Creative Commons versus Intellectual Property
- Hacktivist culture
- using PureData in science and research
- finding an Art Historical context for Open-Source Software Art
- physical computing interfaces
- biological and medical uses of PureData
- et cetera


please read the application guidelines:
http://www.puredata.org/community/projects/convention07/guidelines/

and submit your abstract of full paper via the online form before April 15, 
2007:

http://pure-data.ca/openconf.php

Questions can be directed to: [EMAIL PROTECTED]

=

L'œuvre ouverte | PureData Convention '07
* August 21st to 26th 2007 | Montréal Canada *

Second call for papers - deadline extended

New deadline for abstracts: April 15th, 2007
Results announced: May 31st, 2007
Final papers due: June 30th, 2007

For more information:
http://www.puredata.org/community/projects/convention07/

Online submission form:
http://pure-data.ca/openconf.php

The PureData Convention '07 organising committee is now accepting abstracts
for papers to be included in L'œuvre ouverte, the 2nd International
PureData Convention. L'œuvre ouverte invites an open relationship between
the artwork and the general public, as well as open attitudes and practices
in the fields of computer programming, artistic creation and scientific
research. The convention will bring together the artists, writers,
programmers and many others who develop, use and reflect on PureData and
Open-Source culture.


In order to provide a theoretical and artistic context for understanding
the aesthetic and political stakes of free-software culture, papers from
divergent perspectives are encouraged.


Abstracts and papers may be submitted in either French or English. There
will be three main thematic streams: technical, artistic and theoretical.


Possible topics could include:

- programming PureData externals
- PureData versus Max/MSP
- interfacing with sensors and actuators
- real-time, immersive and interactive environments
- PureData for hackers
- installations with PureData
- PureData in live performance
- sampling and Plunderphonics as artistic strategies
- a new look at Umberto Eco's The Open Work
- OpenContent and Creative Commons versus intellectual property
- Hacktivist culture
- PureData used in science and research
- finding an art-historical context for free-software art
- physical computing interfaces
- biological and medical uses of PureData
- et cetera


Please read the application guidelines carefully:
http://www.puredata.org/community/projects/convention07/guidelines/

Submit your abstract or full paper via the online form before April 15th,
2007:

http://pure-data.ca/openconf.php

Questions can be directed to: