Re: [webkit-dev] Is anyone really using IPP

2012-03-05 Thread Chris Rogers
On Mon, Mar 5, 2012 at 3:02 PM, Benjamin Poulain benja...@webkit.org wrote:

 Hello,

 I have seen a few patches from Intel to add support for Intel IPP for
 some algorithms. A quick search makes me think nobody enables this code.

 Is anyone really using IPP? My concern is if nobody uses/tests it, we
 might be adding dead code to WebKit.

 Benjamin


Hi Benjamin,

The Chromium port is currently evaluating this code; the work is in
active development right now, so it's not stale or dead code.  I'm quite
concerned about testing and am careful to make sure that the IPP
patches are run against the webaudio layout tests before landing.

Chris
___
webkit-dev mailing list
webkit-dev@lists.webkit.org
http://lists.webkit.org/mailman/listinfo.cgi/webkit-dev


Re: [webkit-dev] Time to move branches/audio to branches/old/audio?

2011-10-20 Thread Chris Rogers
That seems fine to me.

On Thu, Oct 20, 2011 at 4:48 PM, Adam Barth aba...@webkit.org wrote:

 Looks like branches/audio hasn't been changed in almost a year and
 WebAudio appears to be fully merged to trunk.  Should we move
 branches/audio to branches/old/audio?

 Thanks,
 Adam



Re: [webkit-dev] XHR responseArrayBuffer attribute: possible implementation

2010-10-25 Thread Chris Rogers
Passing undefined as the 4th and 5th arguments seems pretty clunky to me.
Since we already have an asBlob attribute, an asArrayBuffer attribute like
Darin suggests seems like it might be better.  However, then we can get into
cases where both asBlob and asArrayBuffer are set, and this problem
would get even worse as new types are added.  So, an attribute called
responseType might be a good solution.  This would be instead of passing
it as an argument to open(), and instead of having an asBlob attribute.
This approach seems the cleanest to me, and I'll propose it to the
appropriate standards group.

Chris
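[Editor's note: a minimal sketch of the responseType idea proposed above. This is hypothetical as of this thread (no such attribute had shipped); the mock class below stands in for a real XMLHttpRequest.]

```javascript
// Hypothetical sketch: a single string-valued responseType attribute
// replaces boolean flags like asBlob/asArrayBuffer and avoids extra
// open() arguments. MockXHR is a stand-in, not a real XMLHttpRequest.
class MockXHR {
  constructor() {
    this.responseType = "text"; // default: decoded text, as with responseText
  }
  open(method, url, async) {
    // A real open() also takes optional username/password as the
    // 4th and 5th arguments, which this proposal leaves untouched.
    this.method = method;
    this.url = url;
    this.async = async;
  }
}

const request = new MockXHR();
request.open("GET", "data.xml", true);
// Instead of request.open("GET", "data.xml", true, undefined, undefined, "Bytes"):
request.responseType = "arraybuffer";
console.log(request.responseType); // "arraybuffer"
```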

On Mon, Oct 25, 2010 at 12:45 PM, Alexey Proskuryakov a...@webkit.org wrote:


 On 25.10.2010, at 12:34, Chris Marrin wrote:

 request.open("GET", "data.xml", true, "Text");
 request.open("GET", "data.xml", true, "XML");
 request.open("GET", "data.xml", true, "Bytes");


 I'd sure like to try to avoid an explosion in the API. I like Geoff's
 suggestion of specifying the type of request in open(). Seems like the best
 API would be to have Geoff's API and then:


 Note that open() has username and password as its 4th and 5th arguments.
 So, you'd have to call it like request.open("GET", "data.xml", true,
 undefined, undefined, "Bytes");

 - WBR, Alexey Proskuryakov






Re: [webkit-dev] XHR responseArrayBuffer attribute: possible implementation

2010-10-25 Thread Chris Rogers
True, it's not as simple as checking for the existence of an attribute, but
it could throw an exception. It seems like having both asBlob and
asArrayBuffer also creates the possibility of bad combinations of settings
which could get confusing, especially if more types are added in the future.
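[Editor's note: a hypothetical sketch of how a settable responseType could remain runtime-detectable, per Darin's discoverability concern below. The mock setter here simply ignores unsupported values, so a set-and-read-back reveals support; throwing would work similarly. Names are illustrative only.]

```javascript
// Sketch: detect supported responseType values by writing a value and
// reading it back. MockXHR stands in for a real XMLHttpRequest.
class MockXHR {
  static supported = new Set(["text", "arraybuffer"]);
  #type = "text";
  set responseType(value) {
    // Ignore unsupported values (alternatively, throw an exception).
    if (MockXHR.supported.has(value)) this.#type = value;
  }
  get responseType() {
    return this.#type;
  }
}

function supportsResponseType(xhr, value) {
  const previous = xhr.responseType;
  xhr.responseType = value;
  const ok = xhr.responseType === value; // unchanged read-back means unsupported
  xhr.responseType = previous;           // restore the original setting
  return ok;
}

const xhr = new MockXHR();
console.log(supportsResponseType(xhr, "arraybuffer")); // true
console.log(supportsResponseType(xhr, "blob"));        // false
```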

Chris

On Mon, Oct 25, 2010 at 3:04 PM, Darin Fisher da...@chromium.org wrote:

 How do you address the discoverability issue that I raised?  asBlob and
 asArrayBuffer have the benefit of being detectable at runtime.  But a
 settable responseType does not support detection of supported values.

 -Darin



 On Mon, Oct 25, 2010 at 2:54 PM, Chris Rogers crog...@google.com wrote:

 Passing undefined as the 4th and 5th arguments seems pretty clunky to
 me.  Since we already have an asBlob attribute, an asArrayBuffer attribute
 like Darin suggests seems like it might be better.  However, then we can get into
 cases where both asBlob and asArrayBuffer are set, and this problem
 would get even worse as new types are added.  So, an attribute called
 responseType might be a good solution.  This would be instead of passing
 it as an argument to open(), and instead of having an asBlob attribute.
 This approach seems the cleanest to me, and I'll propose it to the
 appropriate standards group.

 Chris

 On Mon, Oct 25, 2010 at 12:45 PM, Alexey Proskuryakov a...@webkit.org wrote:


 On 25.10.2010, at 12:34, Chris Marrin wrote:

  request.open("GET", "data.xml", true, "Text");
 request.open("GET", "data.xml", true, "XML");
 request.open("GET", "data.xml", true, "Bytes");


 I'd sure like to try to avoid an explosion in the API. I like Geoff's
 suggestion of specifying the type of request in open(). Seems like the best
 API would be to have Geoff's API and then:


 Note that open() has username and password as its 4th and 5th arguments.
 So, you'd have to call it like request.open("GET", "data.xml", true,
 undefined, undefined, "Bytes");

  - WBR, Alexey Proskuryakov










[webkit-dev] XHR responseArrayBuffer attribute: possible implementation

2010-10-22 Thread Chris Rogers
A few weeks ago I brought up the idea of implementing the
responseArrayBuffer attribute for XHR:
http://dev.w3.org/2006/webapi/XMLHttpRequest-2/#the-responsearraybuffer-attribute

One of the concerns was that it might require double the memory usage since
the raw bytes would have to be accumulated along with the decoded text as
it's being built up.  One possible solution which I've been discussing with
James Robinson and Ken Russell is to defer decoding the text, and instead
buffer the raw data as it comes in.  If there's any access to responseText
(or responseXML), then the buffered data can be decoded into text at that
time, and the buffered raw data discarded.  If that case happens, then from
that point on no raw data buffering would happen and the text would be
accumulated as it is right now.  Otherwise, if responseText is never
accessed then the raw data continues to buffer until it's completely loaded.
 Then an access to responseArrayBuffer can easily convert the raw bytes to
an ArrayBuffer.

The idea is that once responseText or responseXML is accessed, then it would
no longer be possible to access responseArrayBuffer (an exception would be
thrown).
Conversely, once responseArrayBuffer is accessed, then it would no longer be
possible to use responseText or responseXML (an exception would be thrown).
This approach does seem a little strange because of the mutually exclusive
nature of the access.  However, it seems that it would be hard to come up
with a reasonable use case where both the raw bytes *and* the text would be
needed for the same XHR.
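[Editor's note: a sketch of the deferred-decoding scheme described above. This is illustrative, not WebKit's actual implementation: raw bytes are buffered as they arrive, the first access to text or ArrayBuffer commits the response to that mode, and the other accessor then throws.]

```javascript
// Sketch: buffer raw bytes, decode lazily, and make text vs. ArrayBuffer
// access mutually exclusive. All names are hypothetical.
class DeferredResponse {
  constructor() {
    this.chunks = [];
    this.mode = null; // null | "text" | "arraybuffer"
  }
  appendBytes(bytes) {
    this.chunks.push(Uint8Array.from(bytes)); // raw data buffered as it loads
  }
  get responseText() {
    if (this.mode === "arraybuffer") throw new Error("already accessed as ArrayBuffer");
    this.mode = "text";
    // Decode the buffered bytes now; a real implementation would then
    // discard the raw buffer and accumulate decoded text from here on.
    return new TextDecoder("utf-8").decode(this.#merge());
  }
  get responseArrayBuffer() {
    if (this.mode === "text") throw new Error("already accessed as text");
    this.mode = "arraybuffer";
    return this.#merge().buffer; // raw bytes converted once fully loaded
  }
  #merge() {
    const total = this.chunks.reduce((n, c) => n + c.length, 0);
    const out = new Uint8Array(total);
    let offset = 0;
    for (const c of this.chunks) { out.set(c, offset); offset += c.length; }
    return out;
  }
}

const r = new DeferredResponse();
r.appendBytes([104, 105]); // bytes for "hi"
console.log(r.responseText); // "hi"
// Accessing r.responseArrayBuffer would now throw.
```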

How does this sound as an approach?

Chris


[webkit-dev] XHR responseArrayBuffer attribute

2010-09-24 Thread Chris Rogers
I've noticed that the responseArrayBuffer attribute has recently been added
to the XMLHttpRequest-2 specification:

http://dev.w3.org/2006/webapi/XMLHttpRequest-2/#the-responsearraybuffer-attribute

I was interested to know if anybody was planning on implementing that
attribute soon.  If not, I would like to add this myself.

Regards,
Chris Rogers


Re: [webkit-dev] XHR responseArrayBuffer attribute

2010-09-24 Thread Chris Rogers
If we added xhr.asArrayBuffer, what would happen if xhr.asBlob was also set?
 Don't we really want something like xhr.loadAsType with different enum
values for text, blob, array buffer, etc.?

On Fri, Sep 24, 2010 at 5:19 PM, Michael Nordman micha...@google.com wrote:

 With xhr.responseBlob we chose to have the caller decide up front and tell
 the xhr object how it would like the response by setting the xhr.asBlob
 attribute prior to calling send(). We could do the same with
 xhr.asArrayBuffer.

 On Fri, Sep 24, 2010 at 5:09 PM, Alexey Proskuryakov a...@webkit.org wrote:


 On 24.09.2010, at 16:37, Chris Rogers wrote:

  I was interested to know if anybody was planning on implementing that
 attribute soon.  If not, I would like to add this myself.

 The key problem to solve is how to not double the memory use of the
 XMLHttpRequest object, while not making responseText and responseXML slow.

 See also: https://bugs.webkit.org/show_bug.cgi?id=40954. Do we need
 both responseBody and responseArrayBuffer?

 - WBR, Alexey Proskuryakov






[webkit-dev] Web Audio API

2010-08-24 Thread Chris Rogers
Over the past months I've been refining the web audio API implementation
that I've been developing in the 'audio' branch of WebKit (per Maciej's
recommendation).  The API has been through a good amount of review by WebKit
developers at Apple, Google, and in the W3C Audio Incubator group.  For
those who are interested, the draft specification is here:
http://chromium.googlecode.com/svn/trunk/samples/audio/specification/specification.html

I have working demos here:
http://chromium.googlecode.com/svn/trunk/samples/audio/index.html

I'll be posting a series of patches to migrate the working code from the
audio branch to WebKit trunk.  Most of the files are new, with only a few
places which will touch existing WebKit files (such as EventTarget, Event).
 The files will be conditionally compiled.  I'm considering using the
following enable:

#if ENABLE(AUDIOCONTEXT)

After discussing the directory layout in some detail with Eric Carlson,
Chris Marrin, Simon Fraser, and Jer Noble, we've decided that the files will
primarily live in two places:

WebCore/audio
WebCore/platform/audio

I know that some had expressed concern that a directory called 'audio' in
WebCore would be confused with the audio element.  The reason I think
'audio' would be a good name is that the API has a direct relationship to
the audio element and, over time, as the API becomes more broadly used, it
will be associated with the audio capabilities of the web platform.  That
said, if anybody has grave concerns over this name, then we can discuss
alternatives.

Anyway, I just wanted to bring these coming changes to everyone's attention.

Regards,
Chris Rogers


Re: [webkit-dev] Web Audio API

2010-08-24 Thread Chris Rogers
Hi Chris,

That also sounds like a reasonable naming scheme.  The only counter-argument
I would have is that we have several directories in WebCore which don't have
the 'web' prefix such as:

WebCore/notifications
WebCore/storage
WebCore/workers

(and not webnotifications, webstorage, webworkers)

I guess I'm just trying to keep to a simpler naming convention.  Since
WebKit is all about the web, it seems like 'web' is implied.

Either way is fine with me, but I have a preference for the simpler 'audio'.

Chris


On Tue, Aug 24, 2010 at 3:10 PM, Chris Marrin cmar...@apple.com wrote:


 On Aug 24, 2010, at 12:05 PM, Chris Rogers wrote:
  #if ENABLE(AUDIOCONTEXT)
 
  After discussing the directory layout in some detail with Eric Carlson,
 Chris Marrin, Simon Fraser, and Jer Noble, we've decided that the files will
 primarily live in two places:
 
  WebCore/audio
  WebCore/platform/audio

 
  I know that some had expressed concern that a directory called 'audio' in
 WebCore would be confused with the audio element.  The reason I think
 'audio' would be a good name is that the API has a direct relationship to
 the audio element and, over time, as the API becomes more broadly used, it
 will be associated with the audio capabilities of the web platform.  That
 said, if anybody has grave concerns over this name, then we can discuss
 alternatives.

 I'd rather see the directories named webaudio and the enable flag named
 WEBAUDIO. This would match the naming of 'websockets' (although not web
 workers, which is simply named 'workers'). I agree that this is directly
 related to the audio element, but it is an optional piece (hence the enable
 flag) and so I think it should have its own naming.

 -
 ~Chris
 cmar...@apple.com







Re: [webkit-dev] Web Audio API

2010-08-24 Thread Chris Rogers
Hi Simon,

#if WEBAUDIO is fine.

Do you also prefer WebCore/webaudio like Chris Marrin, or WebCore/audio?

Chris


On Tue, Aug 24, 2010 at 4:04 PM, Simon Fraser simon.fra...@apple.com wrote:

 On Aug 24, 2010, at 12:05 PM, Chris Rogers wrote:

 Over the past months I've been refining the web audio API implementation
 that I've been developing in the 'audio' branch of WebKit (per Maciej's
 recommendation).  The API has been through a good amount of review by WebKit
 developers at Apple, Google, and in the W3C Audio Incubator group.  For
 those who are interested, the draft specification is here:

 http://chromium.googlecode.com/svn/trunk/samples/audio/specification/specification.html

 I have working demos here:
 http://chromium.googlecode.com/svn/trunk/samples/audio/index.html

 I'll be posting a series of patches to migrate the working code from the
 audio branch to WebKit trunk.  Most of the files are new, with only a few
 places which will touch existing WebKit files (such as EventTarget, Event).
  The files will be conditionally compiled.  I'm considering using the
 following enable:

 #if ENABLE(AUDIOCONTEXT)


 Didn't we decide that WEBAUDIO was a better #ifdef?

 Simon




Re: [webkit-dev] Web Audio API

2010-08-24 Thread Chris Rogers
Hi Simon, thanks for helping here.

By the way, anybody who is interested can look at the files in:
https://svn.webkit.org/repository/webkit/branches/audio/WebCore/audio/

There are actually a number of audio files which could be considered
re-usable, although I know of nothing at this time other than the web audio
API which would use them:

AudioBus.cpp
Biquad.cpp
Reverb.cpp
FFTConvolver.cpp (and other FFT-related files)

possibly also in this category are:
Cone.cpp
Distance.cpp
MidSide.cpp
SinWave.cpp
Panner.cpp (and subclasses)
(maybe a few others I've missed)

Basically, these are the lowest-level building blocks which the higher-level
parts (such as AudioContext, and AudioNode) use.  These lowest-level
building blocks do not (or should not) have any dependencies on the
higher-level code which implements the actual API (and has IDL files).  They
also don't have any dependencies on other parts of WebCore, although they do
use stuff in wtf.

So are you suggesting:

WebCore/webaudio        --- IDL files and API implementation
WebCore/platform/audio  --- lower-level building blocks such as AudioBus.cpp

then with the conditional

#if WEB_AUDIO   - I added an underscore here

Chris

On Tue, Aug 24, 2010 at 4:22 PM, Simon Fraser simon.fra...@apple.com wrote:

 On Aug 24, 2010, at 4:15 PM, Chris Rogers wrote:

  Hi Simon,
 
  #if WEBAUDIO is fine.
 
  Do you also prefer WebCore/webaudio like Chris Marrin, or WebCore/audio?

 I am ambivalent. Everything in WebCore is ultimately web-related, so 'web'
 prefixes on the directories seem redundant.

 One direction would be to use /webaudio for a directory that contains files
 specifically related to the API, and /audio for platform directories that
 contain audio-related code that could be reused for other purposes.

 Simon




Re: [webkit-dev] Web Audio API

2010-08-24 Thread Chris Rogers
Good, it looks like we're getting close.  So we've agreed to how the files
should be split up, but Darin Fisher still was concerned about the 'web'
prefix.

Darin, was it the directory name WebCore/webaudio that you didn't like or:
#if ENABLE(WEB_AUDIO)

Alternatives might be:

WebCore/audio   or   WebCore/audiocontext
#if ENABLE(AUDIO_CONTEXT) or  #if ENABLE(AUDIO_API)

I'm assuming that WebCore/platform/audio we can all agree on...

Chris


On Tue, Aug 24, 2010 at 5:20 PM, Simon Fraser simon.fra...@apple.com wrote:

 On Aug 24, 2010, at 4:47 PM, Chris Rogers wrote:

 Hi Simon, thanks for helping here.

 By the way, anybody who is interested can look at the files in:
 https://svn.webkit.org/repository/webkit/branches/audio/WebCore/audio/

 There are actually a number of audio files which could be considered
 re-usable, although I know of nothing at this time other than the web audio
 API which would use them:

 AudioBus.cpp
 Biquad.cpp
 Reverb.cpp
 FFTConvolver.cpp (and other FFT-related files)

 possibly also in this category are:
 Cone.cpp
 Distance.cpp
 MidSide.cpp
 SinWave.cpp
 Panner.cpp (and subclasses)
 (maybe a few others I've missed)

 Basically, these are the lowest-level building blocks which the
 higher-level parts (such as AudioContext, and AudioNode) use.  These
 lowest-level building blocks do not (or should not) have any dependencies on
 the higher-level code which implements the actual API (and has IDL files).
  They also don't have any dependencies on other parts of WebCore, although
 they do use stuff in wtf.

 So are you suggesting:

 WebCore/webaudio        --- IDL files and API implementation
 WebCore/platform/audio  --- lower-level building blocks such as AudioBus.cpp


 Fine by me!


 then with the conditional

 #if WEB_AUDIO   - I added an underscore here


 That would be #if ENABLE(WEB_AUDIO) in the code.

 Simon




Re: [webkit-dev] Web Audio API

2010-08-24 Thread Chris Rogers
Ok, then it looks like we'll go with this for the directory names:

WebCore/webaudio
WebCore/platform/audio

And this for the feature define:
#if ENABLE(WEB_AUDIO)

Thanks everybody,
Chris

On Tue, Aug 24, 2010 at 9:02 PM, Darin Fisher da...@chromium.org wrote:

 On Tue, Aug 24, 2010 at 8:55 PM, Eric Carlson eric.carl...@apple.com wrote:


 On Aug 24, 2010, at 8:39 PM, Darin Fisher wrote:

 My objection (and it's only a slight one) was about using Web as a
 prefix for class names defined in WebCore.

 WebSockets is the main example of the Web prefix used in WebCore, and
 that's probably because sockets by itself would be too confusing.
  However, I have found the use of the Web prefix in WebCore to lead to some
 confusion by itself since WebKit layers tend to use the Web prefix for their
 classes/interfaces.

 I realize that the WebCore:: namespace makes this issue technically moot.
  I'm just concerned about it being confusing to have WebCore::WebFoo and
 WebKit API level WebFoo.

   I don't think there is any plan to give the *class* name a Web prefix,
 we are just talking about the names of the WebKit folders and the compile
 flag.

 eric



 Ah, OK.  Thanks for clearing that up for me.  It seemed like things were
 headed toward Web* classes given the contents of this folder:
 http://trac.webkit.org/browser/trunk/WebCore/websockets
 -Darin






 On Tue, Aug 24, 2010 at 5:29 PM, Chris Rogers crog...@google.com wrote:

 Good, it looks like we're getting close.  So we've agreed to how the
 files should be split up, but Darin Fisher still was concerned about the
 'web' prefix.

 Darin, was it the directory name WebCore/webaudio that you didn't like
 or:
 #if ENABLE(WEB_AUDIO)

 Alternatives might be:

 WebCore/audio   or   WebCore/audiocontext
 #if ENABLE(AUDIO_CONTEXT) or  #if ENABLE(AUDIO_API)

 I'm assuming that WebCore/platform/audio we can all agree on...

 Chris


 On Tue, Aug 24, 2010 at 5:20 PM, Simon Fraser simon.fra...@apple.com wrote:

 On Aug 24, 2010, at 4:47 PM, Chris Rogers wrote:

 Hi Simon, thanks for helping here.

 By the way, anybody who is interested can look at the files in:
 https://svn.webkit.org/repository/webkit/branches/audio/WebCore/audio/

 There are actually a number of audio files which could be considered
 re-usable, although I know of nothing at this time other than the web audio
 API which would use them:

 AudioBus.cpp
 Biquad.cpp
 Reverb.cpp
 FFTConvolver.cpp (and other FFT-related files)

 possibly also in this category are:
 Cone.cpp
 Distance.cpp
  MidSide.cpp
 SinWave.cpp
 Panner.cpp (and subclasses)
 (maybe a few others I've missed)

 Basically, these are the lowest-level building blocks which the
 higher-level parts (such as AudioContext, and AudioNode) use.  These
 lowest-level building blocks do not (or should not) have any dependencies on
 the higher-level code which implements the actual API (and has IDL files).
  They also don't have any dependencies on other parts of WebCore, although
 they do use stuff in wtf.

 So are you suggesting:

 WebCore/webaudio        --- IDL files and API implementation
 WebCore/platform/audio  --- lower-level building blocks such as AudioBus.cpp


 Fine by me!


 then with the conditional

 #if WEB_AUDIO   - I added an underscore here


 That would be #if ENABLE(WEB_AUDIO) in the code.

 Simon









[webkit-dev] Audio directory layout

2010-03-30 Thread Chris Rogers
I'm interested in people's opinions on where I should put my audio code in
WebKit.

Up to this point I've been assuming I would put my code into:

WebCore/platform/audio

But, on further reflection I realize that the majority of source files are
cross-platform engine code so perhaps it would make more sense to put the
cross-platform parts into:

WebCore/audio

and the platform-specific parts into:

WebCore/platform/audio

Does this seem reasonable?

Cheers,
Chris


Re: [webkit-dev] Audio directory layout

2010-03-30 Thread Chris Rogers
Thanks everyone for your answers.

Darin, I agree that audio might be confusing since HTMLAudioElement would
not be in there, but it still might be the simplest name.  Otherwise, how
about:

WebCore/audio-engine

or

WebCore/audio-processing

??

On Tue, Mar 30, 2010 at 5:31 PM, Darin Adler da...@apple.com wrote:

 On Mar 30, 2010, at 5:19 PM, Adam Barth wrote:

 The platform directory contains a lot more than just
 platform-specific files. It's the platform upon which WebCore is built. For
 example, KURL is in platform even though it's shared by all the ports. I
 think the main consideration for whether to put things in platform relates
 to the dependencies. For example, platform doesn't depend on the rest
 of WebCore.


 That’s right.

 The WebCore/platform directory’s name is a bit of a pun. It contains the 
 *platform
 abstraction*, exposing things present in the underlying operating system
 such as a way to find out about events and screen sizes and such, and also
 contains other basics that provide a *“platform”* for the rest
 of WebCore code, without dependencies on that code. It can be thought of as a
 largely-separate lower level module within WebCore.

 The platform directory *does not* contain all platform-specific files. Nor
 should it. Directories such as WebCore/loader and WebCore/plugins contain
 platform-specific subdirectories as needed.

 As for whether audio should be a top-level concept, that might make sense.
 It seems similar to notifications and storage, which are top-level concepts.


 Code that is specifically about how the web models audio, and not how that
 is integrated with the underlying audio capabilities of the OS, more
 naturally would go somewhere outside the platform directory. A top level
 directory named audio might be confusing since HTMLAudioElement would not be
 in there.

 -- Darin




Re: [webkit-dev] Audio directory layout

2010-03-30 Thread Chris Rogers
Most of my files so far use wtf stuff (OwnPtr, RefPtr, and Vector) so I
guess there are dependencies.

On Tue, Mar 30, 2010 at 5:55 PM, Adam Barth aba...@webkit.org wrote:

 Will there be dependencies on the rest of WebCore?  If not,
 platform/audio might make sense as a peer to platform/graphics.

 Adam


 On Tue, Mar 30, 2010 at 5:45 PM, Chris Rogers crog...@google.com wrote:
  Thanks everyone for your answers.
  Darin, I agree that audio might be confusing since HTMLAudioElement
 would
  not be in there, but it still might be the simplest name.  Otherwise, how
  about:
  WebCore/audio-engine
  or
  WebCore/audio-processing
  ??
 
  On Tue, Mar 30, 2010 at 5:31 PM, Darin Adler da...@apple.com wrote:
 
  On Mar 30, 2010, at 5:19 PM, Adam Barth wrote:
 
  The platform directory contains a lot more than just
  platform-specific files. It's the platform upon which WebCore is built.
 For
  example, KURL is in platform even though it's shared by all the ports. I
  think the main consideration for whether to put things in platform
 relate
  to the dependencies. For example, platform doesn't depend on the rest
  of WebCore.
 
  That’s right.
  The WebCore/platform directory’s name is a bit of a pun. It contains the
  platform abstraction, exposing things present in the underlying
 operating
  system such as a way to find out about events and screen sizes and such,
 and
  also contains other basics that provide a “platform” for the rest
 of WebCore code, without dependencies on that code. It can be thought of
 as a
  largely-separate lower level module within WebCore.
  The platform directory does not contain all platform-specific files. Nor
  should it. Directories such as WebCore/loader and WebCore/plugins
 contain
  platform-specific subdirectories as needed.
 
  As for whether audio should be a top-level concept, that might
 make sense.
  It seems similar to notifications and storage, which are top-level
 concepts.
 
  Code that is specifically about how the web models audio, and not how
 that
  is integrated with the underlying audio capabilities of the OS, more
  naturally would go somewhere outside the platform directory. A top level
  directory named audio might be confusing since HTMLAudioElement would
 not be
  in there.
  -- Darin
 
 



Re: [webkit-dev] Audio directory layout

2010-03-30 Thread Chris Rogers
Oh, I guess wtf doesn't really count as part of WebCore, so then I guess
not...

On Tue, Mar 30, 2010 at 5:57 PM, Chris Rogers crog...@google.com wrote:

 Most of my files so far use wtf stuff (OwnPtr, RefPtr, and Vector) so I
 guess there are dependencies.


 On Tue, Mar 30, 2010 at 5:55 PM, Adam Barth aba...@webkit.org wrote:

 Will there be dependencies on the rest of WebCore?  If not,
 platform/audio might make sense as a peer to platform/graphics.

 Adam


 On Tue, Mar 30, 2010 at 5:45 PM, Chris Rogers crog...@google.com wrote:
  Thanks everyone for your answers.
  Darin, I agree that audio might be confusing since HTMLAudioElement
 would
  not be in there, but it still might be the simplest name.  Otherwise,
 how
  about:
  WebCore/audio-engine
  or
  WebCore/audio-processing
  ??
 
  On Tue, Mar 30, 2010 at 5:31 PM, Darin Adler da...@apple.com wrote:
 
  On Mar 30, 2010, at 5:19 PM, Adam Barth wrote:
 
  The platform directory contains a lot more than just
  platform-specific files. It's the platform upon which WebCore is built.
 For
  example, KURL is in platform even though it's shared by all the ports.
 I
  think the main consideration for whether to put things in platform
 relate
  to the dependencies. For example, platform doesn't depend on the rest
  of WebCore.
 
  That’s right.
  The WebCore/platform directory’s name is a bit of a pun. It contains
 the
  platform abstraction, exposing things present in the underlying
 operating
  system such as a way to find out about events and screen sizes and
 such, and
  also contains other basics that provide a “platform” for the rest
 of WebCore code, without dependencies on that code. It can be thought of
 as a
  largely-separate lower level module within WebCore.
  The platform directory does not contain all platform-specific files.
 Nor
  should it. Directories such as WebCore/loader and WebCore/plugins
 contain
  platform-specific subdirectories as needed.
 
  As for whether audio should be a top-level concept, that might
 make sense.
  It seems similar to notifications and storage, which are top-level
 concepts.
 
  Code that is specifically about how the web models audio, and not how
 that
  is integrated with the underlying audio capabilities of the OS, more
  naturally would go somewhere outside the platform directory. A top
 level
  directory named audio might be confusing since HTMLAudioElement would
 not be
  in there.
  -- Darin
 
 





Re: [webkit-dev] Audio directory layout

2010-03-30 Thread Chris Rogers
Still, I'll need a directory outside of platform for the IDL files and
associated .cpp/.h files.  And the .cpp/.h files really feel like they ought
to live in the same directory as the other cross-platform files, because of
the way the dependencies work there.  That seems to point to keeping most of
these files outside of the platform directory.


On Tue, Mar 30, 2010 at 5:58 PM, Chris Rogers crog...@google.com wrote:

 Oh, I guess wtf doesn't really count as part of WebCore, so then I guess
 not...


 On Tue, Mar 30, 2010 at 5:57 PM, Chris Rogers crog...@google.com wrote:

 Most of my files so far use wtf stuff (OwnPtr, RefPtr, and Vector) so I
 guess there are dependencies.


 On Tue, Mar 30, 2010 at 5:55 PM, Adam Barth aba...@webkit.org wrote:

 Will there be dependencies on the rest of WebCore?  If not,
 platform/audio might make sense as a peer to platform/graphics.

 Adam


 On Tue, Mar 30, 2010 at 5:45 PM, Chris Rogers crog...@google.com
 wrote:
  Thanks everyone for your answers.
  Darin, I agree that audio might be confusing since HTMLAudioElement
 would
  not be in there, but it still might be the simplest name.  Otherwise,
 how
  about:
  WebCore/audio-engine
  or
  WebCore/audio-processing
  ??
 
  On Tue, Mar 30, 2010 at 5:31 PM, Darin Adler da...@apple.com wrote:
 
  On Mar 30, 2010, at 5:19 PM, Adam Barth wrote:
 
  The platform directory contains a lot more than just
  platform-specific files. It's the platform upon which WebCore is
 built. For
  example, KURL is in platform even though it's shared by all the ports.
 I
  think the main consideration for whether to put things in platform
 relate
  to the dependencies. For example, platform doesn't depend on the rest
  of WebCore.
 
  That’s right.
  The WebCore/platform directory’s name is a bit of a pun. It contains
 the
  platform abstraction, exposing things present in the underlying
 operating
  system such as a way to find out about events and screen sizes and
 such, and
  also contains other basics that provide a “platform” for the rest
 of WebCore code, without dependencies on that code. It can be thought
 of as a
  largely-separate lower level module within WebCore.
  The platform directory does not contain all platform-specific files.
 Nor
  should it. Directories such as WebCore/loader and WebCore/plugins
 contain
  platform-specific subdirectories as needed.
 
  As for whether audio should be a top-level concept, that might
 make sense.
  It seems similar to notifications and storage, which are top-level
 concepts.
 
  Code that is specifically about how the web models audio, and not how
 that
  is integrated with the underlying audio capabilities of the OS, more
  naturally would go somewhere outside the platform directory. A top
 level
  directory named audio might be confusing since HTMLAudioElement would
 not be
  in there.
  -- Darin
 
 




___
webkit-dev mailing list
webkit-dev@lists.webkit.org
http://lists.webkit.org/mailman/listinfo.cgi/webkit-dev


Re: [webkit-dev] Audio directory layout

2010-03-30 Thread Chris Rogers
Maciej proposed that I work in my own special branch of WebKit.  Maybe it
makes sense for me to check all my code (unreviewed) into that branch using
a provisional directory layout.  Then people can see how all the code works
together, build it, run my demos, etc., and get a better feeling for any
changes I may need to make before the code is checked into trunk.

On Tue, Mar 30, 2010 at 6:44 PM, Adam Barth aba...@webkit.org wrote:

 It sounds like you're thinking about the audio feature as vertically
 integrated instead of thinking about how its different layers relate
 to the other layering decisions in WebCore.  (This is, of course,
 without understanding the code at all, so I might be wildly off base.)

 Adam


 On Tue, Mar 30, 2010 at 6:04 PM, Chris Rogers crog...@google.com wrote:
  Still, I'll need a directory outside of platform for the IDL files and
  associated .cpp/.h files.  And the .cpp/.h files really feel like they
 ought
  to live in the same directory as the other cross-platform files, because
 of
  the way the dependencies work there.  That seems to point to keeping most
 of
  these files outside of the platform directory.
 


[webkit-dev] std::complex affects isinf(), etc.

2010-02-03 Thread Chris Rogers
I initially put in a patch for a class for Complex numbers, but people
preferred that I just use the std::complex version.

In the process of switching my code over to use std::complex I noticed a
conflict with isinf(), isnan(), etc.
The problem is that simply including:

#include <complex>

breaks the isinf(), isnan() functions (and some others I think).  So now I'm
getting compile errors in any header files
which use these functions, such as WebGLFloatArray.h (which I need to
include for music visualizer stuff).
I'm a bit queasy about all the side-effects of simply including <complex>
and am not even sure how to address the
current situation, short of switching all of webkit over to using
std::isinf, std::isnan, etc.

Now I remember having similar problems with this in other codebases I've
worked on, as the effects of <complex> seem
to be viral...

Anybody have any recommendations?

Thanks,
Chris


Re: [webkit-dev] std::complex affects isinf(), etc.

2010-02-03 Thread Chris Rogers
Basically, if you include <complex> then it undefines the functions (or
macros) for isinf(), isnan(), and others, and then expects you
to use std::isinf(), std::isnan() instead.  We use these functions in a
number of places, so we'd need to figure out a reasonable solution.
Or just go with my original class.

On Wed, Feb 3, 2010 at 5:01 PM, Sam Weinig sam.wei...@gmail.com wrote:

 What specific errors are you getting? I don't understand why including a
 standard header would break other standard functions.

 -Sam






[webkit-dev] Heads up for audio changes

2010-02-01 Thread Chris Rogers
Hey guys,

Just so nobody will be surprised, over the next few weeks I'm going to start
landing some new code for an audio engine.
This code will primarily live in a new directory WebCore/platform/audio and
will implement some new audio features such as:

* scheduled sound playback for sample-accurate musical applications
* spatialized audio such as what is found in OpenAL (source/listener-based,
distance effects, sound cones, doppler-shift, ...)
* a convolution engine for a wide range of linear effects, especially very
high-quality room effects (concert halls, etc.)
* realtime analysis / music visualizer support
* a modular effects architecture (still being fleshed out)
* granular effects

I've been talking with Eric Carlson, Simon Fraser, Chris Marrin, and Dean
Jackson to work out some API issues.  And as soon as this all gels, I'll be
landing the IDL files.  In the meantime, there's quite a lot of fundamental
engine code which will not be affected by the API which I hope to land in
the near future.

Best Regards,
Chris Rogers