Hi..
I've built a CMS that's running into a serious speedbump.
I load several sets of javascript through a custom-built db
caching/compression system.
Loading it without caching and compression takes between 2 and 3 seconds for
the entire app, 112Kb of javascript.
The caching system first
that this is not correct, because if i remove the readfile
statement, things speed up to 120 milliseconds (from 3.37 seconds)..
Some more help would be greatly appreciated..
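Not part of the original thread, but here is a minimal sketch of the kind of file-cached gzip serving being described; the cache layout and function names are my assumptions, not the poster's actual system:

```php
<?php
// Illustrative sketch only -- cache layout and function names are assumed.
// Build the cache once: compress the combined javascript with gzencode().
function buildJsCache($js, $cacheFile) {
    file_put_contents($cacheFile, gzencode($js, 9));
}

// Serve it: send the gzip headers and stream the file with readfile(),
// which avoids loading the whole blob into a PHP string first.
function serveCachedJs($cacheFile) {
    header('Content-Type: application/javascript');
    header('Content-Encoding: gzip');
    header('Content-Length: '.filesize($cacheFile));
    readfile($cacheFile);
}
```

The browser decompresses transparently because of the Content-Encoding header, so no decompression cost lands in PHP on the serving path.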
On Wed, Jun 4, 2008 at 9:20 AM, Nathan Nobbe [EMAIL PROTECTED] wrote:
On Wed, Jun 4, 2008 at 3:05 AM, Rene Veerman [EMAIL
Using a static .js file costs 3.54 seconds.
Should i suspect apache now?
it's equally slow when viewed from a local machine running
firefox+apache+php+mysql, or from the internet hoster i use..
On Wed, Jun 4, 2008 at 5:48 PM, Nathan Nobbe [EMAIL PROTECTED] wrote:
On Wed, Jun 4, 2008 at 8:49 AM, Rene Veerman [EMAIL PROTECTED] wrote:
Using a static .js file costs
In my previous post, I listed 50Kb of obfuscated gzipped javascript loading
in 3.5 seconds.
It turns out that it was the decompression of the obfuscation that was
causing the delay.
Twiddling with parameters for phpjso (the obfuscator) brought that down to
just 320ms.
But now i have sort of
Hi.
I want to enable video and flash for my CMS, but that requires that i can
read at least the dimensions of a video file..
ImageMagick's identify command supposedly reads AVI and MPEG, but i can't
get it to work on my AVI files:
C:\Users\rene\Documents\Downloads>identify
Hi.
I've built a photo-cms (http://mediabeez.ws) to which i now want to add
e-commerce support.
Adding product-definitions to photos wasn't much of a problem, and adding a
shopping cart wasn't either.
What i'm worried about is the accounting part of a webshop system.
I've downloaded some
Hi. Today, i've got a chicken-and-egg puzzle for your enjoyment :)
In order to properly support google indexing of published content hosted by
my CMS, which scales to browser-size, no matter what it is initially or how
the user resizes it.
Adding meta info for the many pictures hosted via my CMS
Hi.
For my CMS, i need to do imports of photos and videos.
For uploading large files i have a Java FTP applet with automated resume, so
no problems there.
But: unzipping them and importing them might take a long time if it's a
large batch (1000s of pictures/videos).
Currently i split this up by
I use the following script to forward an ajax call from the browser
to a server other than my main webserver.
On my homeserver (debian/apache2/php5 with rootxs) it runs fine, but on
my shared hosting (servage.net, phpinfo at
http://mediabeez.veerman.ws/mb/sn.php) it freezes / hangs on
Nathan Rixham wrote:
do a phpinfo() and check if sockets are enabled.. lots of shared hosts
have them turned off
yep, sockets are enabled on the shared hoster.
--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
It's actually the server i'm calling that's at fault, it didn't serve
connections from the outside world..
So not a php thing at all, sry for the wasted time..
Rene Veerman wrote:
I use the following script to forward an ajax call from the
browser to a server other than my main webserver
Hi, i have the following php statements;
I'm wondering why exec()'s $output remains empty..
Any help is greatly appreciated..
<?php
$cmd = 'nice -n 19 ffmpeg -i "'.$sourcePath.'"';
exec($cmd, $output, $result);
return array(
'cmd' => $cmd,
'output' => $output,
bingo, this fixed it :)
thx (all) for answering so quickly :)
Daniel Brown wrote:
Try this instead, just to make sure everything's running as expected:
<?php
$cmd = 'nice -n 19 ffmpeg -i "'.$sourcePath.'" 2>&1';
exec($cmd, $ret, $err);
print_r($ret);
?>
$ret (or whatever you name the
I have other ffmpeg statements that i execute in the same manner. They
do produce the desired result-files, but also do _not_ have $output set
to the text i see when i run the commands from the commandline..
I'd like to get output from all my executions of ffmpeg, it's useful for
detailing
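For what it's worth, the empty $output is usually because ffmpeg writes its banner and progress to stderr, not stdout; a sketch (the $sourcePath variable is a placeholder):

```php
<?php
// ffmpeg reports on stderr, so exec()'s $output stays empty unless
// stderr is redirected into stdout with 2>&1.
function runFfmpeg($sourcePath) { // $sourcePath is a placeholder
    $cmd = 'nice -n 19 ffmpeg -i '.escapeshellarg($sourcePath).' 2>&1';
    exec($cmd, $output, $result);
    return array('cmd' => $cmd, 'output' => $output, 'result' => $result);
}

// The same redirection works for any command that talks on stderr:
exec('sh -c "echo from-stderr 1>&2" 2>&1', $out);
// $out now holds the line that went to stderr
```

That explains why the result-files appear fine while $output stays empty: the conversion succeeds, but its chatter never reaches stdout.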
I have a script that uses curl to call a worker function on another server.
For small workloads, it works just fine.
But when my script processes a large zip-file and updates some status
files, curl_exec never returns the result data even though the called
script does send it.
Any ideas?
Robert Cummings wrote:
On Wed, 2008-10-01 at 17:31 +0200, Rene Veerman wrote:
I have a script that uses curl to call a worker function on another server.
For small workloads, it works just fine.
But when my script processes a large zip-file and updates some status
files, curl_exec never
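One thing worth checking (an assumption on my part, not a confirmed diagnosis): cURL's default timeouts can cut off a long-running worker request, so curl_exec() returns false instead of the data. A sketch with generous limits; the URL is a placeholder:

```php
<?php
// Sketch: give the long-running worker call generous timeouts and ask
// curl to return the body to the caller instead of echoing it.
function callWorker($url, $timeoutSeconds = 3600) { // $url is a placeholder
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return data, don't echo
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);   // seconds allowed to connect
    curl_setopt($ch, CURLOPT_TIMEOUT, $timeoutSeconds); // total time allowed
    $data = curl_exec($ch);
    if ($data === false) {
        $data = 'curl error: '.curl_error($ch);
    }
    curl_close($ch);
    return $data;
}
```

Also worth ruling out: PHP's own max_execution_time on either end killing the script before the response is written.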
hi, i'd like my app to send sms warnings of some events.
if you know of a free / cheap sms service that can be called from php,
please let me/us know.
you earn extra points if it can send to dutch phones / any phone in the
world ;)
thanks!
Stut wrote:
On 8 Oct 2008, at 20:33, Rene Veerman wrote:
hi, i'd like my app to send sms warnings of some events.
if you know of a free / cheap sms service that can be called from
php, please let me/us know.
you earn extra points if it can send to dutch phones / any phone
Apologies for posting a monthly/yearly recurring theme here..
If someone can add links to previous discussions relating to the same,
that could help too.
I'd like to use subversion on a home unix server of mine to keep track
of my projects.
I don't even know what i need for a good integration
I use ffmpeg (unix commandline) to do the video converting..
ffmpeg needs to be properly re-compiled with mp3 support (for audio in
the flv files); there's tutorials on google on how to do that.
Are you on shared hosting? Most won't allow any kind of video conversion
on shared hosting.
I had
Daniel Brown wrote:
On Tue, Oct 21, 2008 at 4:26 AM, Brennon Bortz [EMAIL PROTECTED] wrote:
Actually, speaking as someone now living in the UK, your low end is LESS
than minimum wage here. Rather insulting, if you ask me...
Simple advice then: delete the message and don't reply.
http://mediabeez.ws/mediaBeez/permalink.php?tag=visCanvasLoaderGraphic
This is the second opensource plugin that I release; a loader icon capable
of displaying colorful yet semi-transparent animated graphics. It's
simple to use and design for..
I'll admit right now that there are still some
Hi.
I'm trying to build a new admin interface for my cms, in a single screen.
I was thinking to have a tree-view on the left side of that screen,
something like
+Site-Name
+ UserGroups and Users (node-type section)
+ Administrators (node-type usergroup)
- administrator (node-type
Thodoris wrote:
How are your groups linked with your users? Does every user have a
GroupId in his record?
Does this question have to do with building an html tree, or is it about
the best database schema that helps to construct and retrieve a tree
faster?
My users-table is linked to the
Thodoris wrote:
How are your groups linked with your users? Does every user have a
GroupId in his record?
Does this question have to do with building an html tree, or is it about
the best database schema that helps to construct and retrieve a tree
faster?
oh, and while searching i landed
Friend of mine wrote this article that might be of interest to you:
Often there is advertising code to be implemented in a page, and there
are 2 problems one may face:
(1) the website hangs due to a lag on the code delivering server
(2) you normally cannot lazy load the script since
Richard Heyes wrote:
Commit Early Commit Often. :P
That's for wimps... :-)
and people who hate spending much time retyping last week's work...
Eric Butera wrote:
I cheat and just keep the normal parentId column and regenerate the
tree based on changes on that. I had spent a little bit looking at
the different update/delete methods and there wasn't a lot of good
information/examples as you've stated. I had found some but there
were
Rene Veerman wrote:
Eric Butera wrote:
I cheat and just keep the normal parentId column and regenerate the
tree based on changes on that. I had spent a little bit looking at
the different update/delete methods and there wasn't a lot of good
information/examples as you've stated. I had found
that java app re-arrange large parts of the tree, and i
wonder if it's still the same tree i'm looking at :)
--
Rene Veerman, creator of web2.5 CMS http://mediabeez.ws/
Rene Veerman wrote:
i'm still trying to get my head around how the operations are done.
sometimes i see that java app re-arrange large parts of the tree, and
i wonder if it's still the same tree i'm looking at :)
omg, somebody grab the LART and give me a good spanking on the back of
my head
..
will be reduced to a minimum.
Rene Veerman wrote:
hi, i need to search with regexps in strings, and want to know the
index (of the source string) at which the regexp matched. and the
length of the matched string, so with substitutions like \d+..
i'm kinda in a hurry on this one, would appreciate your immediate
answer
i'm getting freezes for the 3rd to Nth concurrent request on my
homeserver (got root, on debian4 + apache2).
how can i allow more threads? like 50 or so?
Jochem Maas wrote:
Rene Veerman schreef:
i'm getting freezes for the 3rd to Nth concurrent request on my
homeserver (got root, on debian4 + apache2).
how can i allow more threads? like 50 or so?
probably need to fix the apache.conf to allow more concurrent child processes.
K
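To make that concrete: with the prefork MPM that mod_php typically runs under, the concurrency ceiling is set by these apache2.conf directives. The values below are illustrative, not recommendations:

```apache
<IfModule mpm_prefork_module>
    StartServers          5
    MinSpareServers       5
    MaxSpareServers      10
    ServerLimit          50
    MaxClients           50
    MaxRequestsPerChild 500
</IfModule>
```

MaxClients (bounded above by ServerLimit) is the number of simultaneous requests Apache 2.2 will serve before queueing new connections.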
Jochem Maas wrote:
Rene Veerman schreef:
Jochem Maas wrote:
Rene Veerman schreef:
i'm getting freezes for the 3rd to Nth concurrent request on my
homeserver (got root, on debian4 + apache2).
how can i allow more threads? like 50 or so?
probably need to fix
Gal Gur-Arie wrote:
Rene Veerman wrote:
i'm getting freezes for the 3rd to Nth concurrent request on my
homeserver (got root, on debian4 + apache2).
how can i allow more threads? like 50 or so?
Any chance that you're using session and checking it from the same
browser from different
Hi, i got ffmpeg to convert videos for my CMS to Flash-video.
With debian, it was real easy to set up.
The only drawback is that i can't advance the progressbar during the
call to ffmpeg..
If i encode longer videos, it can take up to an hour each, and to halt
the progressbar for that long
continually updates)
since it doesn't print the number of input frames, i can't calculate
progress from frame-count in the last line.
not even with ffmpeg -v verbose do i get the frame count..
Ashley Sheridan wrote:
On Wed, 2008-12-10 at 22:13 +0100, Rene Veerman wrote:
Hi, i got ffmpeg
Boyd, Todd M. wrote:
Top posting BAD! Hulk SMASH!
Anyway, moving on...
I believe ffmpeg -i filename.ext should give you frame count
information (along with a bunch of other stuff). It will have to be
parsed, of course, but... meh. Also--were you aware that there is an
ffmpeg PHP extension?
Colin Guthrie wrote:
'Twas brillig, and Rene Veerman at 10/12/08 23:03 did gyre and gimble:
Well, nowhere can i find the frame count being printed, but there
_is_ a duration: hh:mm:ss:ms field outputted, and the updating line
displays a time=seconds.ms (the time in the movie where the encoder
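The Duration/time= approach can be sketched like this; the regexes are my guess at the old stderr format (where time= is in plain seconds), so verify them against your ffmpeg build:

```php
<?php
// Sketch: derive a progress percentage from ffmpeg's stderr text using
// the "Duration: hh:mm:ss.xx" header and the latest "time=" field.
function ffmpegProgress($stderrText) {
    if (!preg_match('/Duration: (\d+):(\d+):(\d+)\.(\d+)/', $stderrText, $d)) {
        return null; // no duration header found
    }
    $total = $d[1] * 3600 + $d[2] * 60 + $d[3] + $d[4] / 100;
    // take the last time= occurrence: the status line ffmpeg keeps updating
    if (!preg_match_all('/time=([\d.]+)/', $stderrText, $t)) {
        return null;
    }
    $done = (float) end($t[1]);
    return min(100.0, $done / $total * 100.0);
}
```

Polling the log file ffmpeg's stderr is redirected into, and feeding its tail through this function, gives a progressbar without needing the frame count.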
German Geek wrote:
On Thu, Dec 11, 2008 at 1:53 PM, Rene Veerman [EMAIL PROTECTED]
mailto:[EMAIL PROTECTED] wrote:
Colin Guthrie wrote:
'Twas brillig, and Rene Veerman at 10/12/08 23:03 did gyre and
gimble:
Well, nowhere can i find the frame count being
Hi..
I have created a setup between my shared hoster and my home debian box,
where the shared hosting accepts video uploads, and forwards them to the
home server for video-conversion (which isn't allowed on shared hosting).
In order to kick off the import process, I need the shared hoster to
Unfortunately neither ping nor traceroute is installed on the shared
hoster, so i can't call 'em..
Nathan Nobbe wrote:
On Wed, Jan 7, 2009 at 11:21 AM, Rene Veerman rene7...@gmail.com
mailto:rene7...@gmail.com wrote:
Hi..
I have created a setup between my shared hoster and my home debian
Hi, i'm doing a forward to another server (somehost.com/2.php) by using
header("Location: http://someserver.com/script/1.php");
But on neither servers do i get hits in the apache access log for either
script being called.
Any ideas?
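Two classic causes, offered as assumptions since the snippet is truncated: the Location value not being an absolute URL, and the script continuing to run after the header is set. A hypothetical helper:

```php
<?php
// Hypothetical helper: the Location header wants an absolute URL, and
// nothing may be echoed before it; returning false lets the caller see
// a malformed URL instead of a silent non-redirect.
function redirect($url) {
    $parts = parse_url($url);
    if (empty($parts['scheme']) || empty($parts['host'])) {
        return false; // not an absolute URL
    }
    header('Location: '.$url);
    return true; // caller should exit; right after this
}
```

After a successful redirect() the script should exit, otherwise the rest of the page still executes and may never show up in either access log the way you expect.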
Andrew Ballard wrote:
I'll take the simple answer and say neither script IS being called?
OK, I'll go back to my corner now.
Andrew
[Geesh - I take the time to send out a sarcastic message and forget to
send it to the list.]
The scripts are called, because i can see the data coming back
Hi,
I've been stuck on this problem i'm having after i re-installed my linux
(debian) machine last month..
I'm building a CMS with video import capabilities, but since it runs on
shared hosting i need to make a cURL POST call to a URL on my home machine,
which does video-conversion.
This curl
OOPS :)
As a second test, i changed the test-url to
http://82.170.249.144:81/mediaBeez/sn.php
On Tue, Jan 20, 2009 at 7:33 PM, Rene Veerman rene7...@gmail.com wrote:
Hi,
I've been stuck on this problem i'm having after i re-installed my linux
(debian
curl_setopt($ch, CURLOPT_PORT, 81);
#grab URL and pass it to the browser
curl_exec($ch);
$error = curl_error($ch);
#close cURL resource, and free up system resources
curl_close($ch);
echo $error;
//phpinfo();
?>
On Tue, Jan 20, 2009 at 7:35 PM, Rene Veerman rene7...@gmail.com wrote:
OOPS
Hi, I'm cross-posting this (from jquery-en js mailinglist) because it's
something that i think is relevant for this list too..
You can ignore the jQuery in it, since all the jquery calls can be
replaced with document.getElementById().
I have secured the login form for my CMS with a
Al wrote:
I'm scripting a light-weight, low volume signup registry for a running
club. Folks sign up to volunteer for events and the like. There will
generally be a handful of signup registries at any one time. A typical
registry will only contain 50 to 100 names. Each registry is only in
Just for this case, where authentication of the server isn't an issue,
and things like deployment cost are,
i'd like to propose that we on this list look again at securing
login/pass through onewayHash functions, in an otherwise non-ssl
environment.
i hate to be a critic of the community
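The usual shape of such a scheme is challenge-response; this is my sketch, not a proposal from the thread, and not a substitute for SSL against active attackers:

```php
<?php
// Challenge-response sketch: the password never crosses the wire.
// The server stores sha1($password); per login attempt it issues a
// one-time random nonce. The browser-side javascript would send
// sha1($nonce . $storedHash); the server recomputes and compares.
function makeNonce() {
    return sha1(uniqid(mt_rand(), true));
}
function expectedResponse($nonce, $storedPasswordHash) {
    return sha1($nonce . $storedPasswordHash);
}
function checkLogin($nonce, $storedPasswordHash, $clientResponse) {
    return expectedResponse($nonce, $storedPasswordHash) === $clientResponse;
}
```

This stops passive sniffing and replay of old logins, but an active man-in-the-middle can still rewrite the login page itself, which is why it only fits the "authentication of the server isn't an issue" case described above.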
Hi,
I would like the opinion of the readers of this list on whether or not they
agree on the usefulness of adding some new functions to the core of PHP.
Background Info:
I want more debug-information from my scripts.
And I want to perform lengthy operations in a more robust way.
And I want to
already thought of a small improvement:
function traceHandler (
$file = string;fullpath,
$lineNumber = integer,
$functionName = string,
$eventIsStartOfFunction=boolean, // false = being called at exit of
the function
$arguments = array(
On Fri, Dec 25, 2009 at 11:26 AM, Andy Shellam andy-li...@networkmail.euwrote:
Hi,
Have you taken a look at Xdebug - http://xdebug.org/ ?
From the manual: Xdebug allows you to log all function calls, including
parameters and return values to a file in different formats.
Would this do what
+1 from Amsterdam :)
a happy, productive profitable new year to all a ya.
On Fri, Dec 25, 2009 at 3:16 PM, Shawn McKenzie nos...@mckenzies.netwrote:
Merry Christmas from Texas, USA!
On Fri, Dec 25, 2009 at 9:50 PM, Kim Madsen php@emax.dk wrote:
Copenhagen sends kind regards too, may you all have some swell days.
And C U at Queensday? :o)
Alas, Queensday is not what it used to be.
Ever since they (our fun-hating mayor Cohen and his cronies) closed down the
night
I was dealing with large deep arrays in PHP and wanted a better way to
view such data structures in the browser.
So i've built a few functions that show such structures initially collapsed,
with various options to click-and-see what's in a sub-value.
It can also handle HTML within JSON, and JSON
I'm working on a better var_dump (http://mediabeez.ws/htmlMicroscope/,
LGPL), and want to launch my kate editor when i click in the browser on a
line in my trace-log.
I'm trying to exec() this line, but it returns 1 (which is, i believe, a
general error)
echo hhh | sudo -u rene -S /bin/sh -c
r...@ekster:~$ uname -a
Linux ekster 2.6.31-17-generic #54-Ubuntu SMP Thu Dec 10 16:20:31 UTC 2009
i686 GNU/Linux
r...@ekster:~$ apache2 -v
Server version: Apache/2.2.12 (Ubuntu)
Server built: Nov 12 2009 22:49:46
r...@ekster:~$ php -v
PHP 5.2.10-2ubuntu6.3 with Suhosin-Patch 0.9.7 (cli)
Thanks for the reply..
I only need this to work locally on the web-server for now..
So i'm calling a script through ajax routines, which would do the exec().
Since it's on the local webserver, that should work, right?
-- Forwarded message --
From: Rene Veerman rene7...@gmail.com
Date: Fri, Jan 8, 2010 at 1:58 PM
Subject: Re: [PHP] trying to launch kate from the browser
To: a...@ashleysheridan.co.uk
Yep, i also just thought of using ssh/ftp to remotely edit files.
I can probably configure
oh,
echo blah | sudo -u rene -S /bin/sh -c "export HOME=/home/rene/"
exec($str,$o,$r);
$r === 0.
so that works.
therefore, it must be kate itself that refuses to start up from apache's
context.
too bad $o === empty array.
any ideas are most welcome.
$str = 'echo blah | sudo -u rene -S /bin/sh -c "export HOME=/home/rene/; export"';
exec($str,$o,$r);
$r === 0.
$o =
<pre class='xdebug-var-dump' dir='ltr'>
<b>array</b>
0 => 'export HOME=&apos;/home/rene/&apos;'
...more...
</pre>
-- Forwarded message --
From: Rene Veerman rene7...@gmail.com
Date: Fri, Jan 8, 2010 at 2:29 PM
Subject: Re: [PHP] trying to launch kate from the browser
To: Bob McConnell r...@cbord.com
A: str_replace() ;) maybe a preg_replace() but i don't think that's even
necessary.
B
hmm. after a nap i'm gonna try to start the editor directly from the browser
instead, with ssh:// hopefully to get/write the file on the server.
demo free download @ http://mediabeez.ws/htmlMicroscope/
Check it out at http://mediabeez.ws/htmlMicroscope
Last modified Mon, 11 Jan 2010, 13:00 UTC+0100
Latest changes:
* Javascript hm() support added
* Added ability to view HTML strings as rendered HTML.
I admit there are still problems with some HTML taking over the
To do what you want that new httpd server should at least be able to
call up PHP via cli / whatever, and retrieve the output.
It also needs to provide what in php are called $_GET and $_POST.
Assuming you've got that covered, then yes, you could route the calls via
ajax (i recommend jquery.com for
oh, and if you're going to use ajax-non-phphttpd-php-andback, then
check if your dear non-php httpd abuses the CPU while waiting for PHP
to return the results...
ok, you might wanna re-ask on an apache list in that case..
On Wed, Jan 20, 2010 at 6:48 AM, Camilo Sperberg unrea...@gmail.com wrote:
On Wed, Jan 20, 2010 at 02:33, Rene Veerman rene7...@gmail.com wrote:
if (isset($_SERVER['HTTP_IF_MODIFIED_SINCE']) AND
strtotime($_SERVER
for a small(ish) dataset, or
- the client generates data (also to be translated to html) that the
server doesn't really need to know about (yet)
js can really take some stress off the server.
On Wed, Jan 20, 2010 at 9:31 AM, Michael A. Peters mpet...@mac.com wrote:
Rene Veerman wrote:
That's how
Hi, for http://mediabeez.ws/htmlMicroscope/ (lgpl) i need to make
large complex but _randomly filled_ test-arrays.
The code i wrote (hastily) for this is taking over 2 hours to generate
65k array values.
I wonder if any of you see a way to improve its speed.
global $hmGenKeys;
$hmGenKeys = 0;
this is at least 1000% faster
the crux is that array()+=array() is way faster than any array_merge()
operations.
global $hmGenKeys;
$hmGenKeys = 0;
function htmlMicroscope_generateRandomArray ($maxKeys, $maxDepth,
$maxDuration=-1) {
global $hmGenKeys;
if ($maxKeys!==null) {
$hmGenKeys =
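The speed claim is easy to micro-benchmark; a sketch (with purely string keys both builders return the same data, so only the timing differs):

```php
<?php
// Micro-benchmark sketch: accumulate many small arrays with the union
// operator versus repeated array_merge() calls.
function buildWithUnion(array $chunks) {
    $out = array();
    foreach ($chunks as $c) { $out += $c; } // in-place union, no full copy
    return $out;
}
function buildWithMerge(array $chunks) {
    $out = array();
    foreach ($chunks as $c) { $out = array_merge($out, $c); } // re-copies $out
    return $out;
}

$chunks = array();
for ($i = 0; $i < 1000; $i++) {
    $chunks[] = array('k'.$i => str_repeat('x', 64));
}
$t = microtime(true); buildWithUnion($chunks); $union = microtime(true) - $t;
$t = microtime(true); buildWithMerge($chunks); $merge = microtime(true) - $t;
// $merge grows roughly quadratically with the chunk count, $union does not
```

array_merge() builds a fresh copy of the accumulated array on every call, which is what makes it so much slower as the result grows.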
I usually roll my own, unless there's a free lib / cms that does the
trick near-perfectly, and is well-written (so extensible)
About 80-90% of my tasks require me to use one of my own frameworks (i
have 2, one simple and one with many graphical gimmicks), and i
re-use / improve the mid-level
If you want multiple-inheritance, you can include (encapsulate is the word
i think) your smaller classes in midware and big top-level classes
that expose (part of) their interfaces. It's easy.
But guard against creating too many dependencies between different
smaller classes and bigger classes.
Oh, and i'd allow 1 (or _maybe_ 2) very big super-class(es) at the
top level of a framework / cms, that do include 50-100 smaller
classes.
midware classes can evolve (be extracted) from the superclass, as
your app evolves.
Try to keep groups of functions relating to as few smaller/lower
classes
Have you tried letting the php script output \r\n instead of just
\n as newline ?
On Wed, Jan 27, 2010 at 10:46 PM, Alexandre Simon lexsi...@gmail.com wrote:
Hello,
I'm pretty sure (in reality I do not understand a lot about the problem...
:( ) this is a distribution or a version problem but
I'd like to add that when dealing with large memory structures,
usually arrays, combining them is fastest when done like this:
$array1 += $array2;
This will not always produce correct results when dealing with arrays
that contain identical keys, but for non-overlapping arrays it is far
faster
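Concretely, the difference on overlapping keys looks like this:

```php
<?php
// The union operator keeps the LEFT operand on key collisions;
// array_merge() lets the right side win for string keys and renumbers
// integer keys instead of overwriting them.
$a = array('x' => 1, 0 => 'first');
$b = array('x' => 2, 0 => 'second');

$union  = $a + $b;             // array('x' => 1, 0 => 'first')
$merged = array_merge($a, $b); // array('x' => 2, 0 => 'first', 1 => 'second')
```

So + is safe (and fast) exactly when the arrays are known not to overlap, which is the case described above.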
Allen, i think your primary design consideration is this;
- Need to be able to build a large variety of reports based on existing data
This would lead me to jump at a 'plugin' solution.
I'd make:
- 1 superclass as interface to the rest of your webapp(s)
- 1 top-class that has the
It's Rene, not Renee :p
curl is a method of fetching http pages from within php (and other languages).
with parsing i meant parsing (processing) a html page into (in my case) an
array of hits found on that page.
changed about 7-10 functions so i don't know which one was the pig..
On Thu, Jan 28, 2010 at 12:41 PM, Ford, Mike m.f...@leedsmet.ac.uk wrote:
-Original Message-
From: Rene Veerman [mailto:rene7...@gmail.com]
Sent: 27 January 2010 22:46
And if your script needs to pass large (> 5Mb) arrays
Hi..
I've built http://mediabeez.ws/htmlMicroscope/ (lgpl), which provides
a way to look at very big arrays in the browser.
I'm kinda stuck at a 120 - 200Mb data-size limit, and have exhausted
all my ideas on how to increase that limit further.
My reasoning is that i have a gigabyte free memory
On Thu, Jan 28, 2010 at 4:04 PM, Robert Cummings rob...@interjinn.com wrote:
Use memory_get_usage() to see just how much memory you're using. Somewhere in your
script you are probably storing much more memory than you think.
My functions do print the memory usage;
I'm just wondering why the array
Oh, i forgot to mention that firefox takes about a gigabyte of memory
after having stalled at 200mb parsed in a 330mb document..
And despite using setTimeout(), firefox frequently freezes (for about
2 to 10 minutes), before updating the decoding-status display again.
I'd really appreciate
At 200Mb/330Mb parsing, i have released 200Mb of html comment nodes,
and should have accumulated only 200Mb of javascript array/object.
it's _just_ the data, no HTML has been generated yet.
I accept a 5x overhead for turning it into HTML, but wonder why
firefox
a) stops updating the screen despite
On Thu, Jan 28, 2010 at 5:32 PM, Ashley Sheridan
a...@ashleysheridan.co.ukwrote:
You could page through the data and make it look like it's happening all in
the browser with a bit of clever ajax
Ok, good point.
Maybe JSON-transport javascript parsing just has its limit at just over
100
On Thu, Jan 28, 2010 at 12:31 AM, clanc...@cybec.com.au wrote:
On Wed, 27 Jan 2010 10:21:00 -0800, deal...@gmail.com (dealtek) wrote:
Opening tables, etc, wrongly generally messes the page up completely, but
forgetting to close them again often has no visible effect at all
-- until
On Thu, Jan 28, 2010 at 10:17 PM, clanc...@cybec.com.au wrote:
On Thu, 28 Jan 2010 21:10:42 +0100, rene7...@gmail.com (Rene Veerman) wrote:
On Thu, Jan 28, 2010 at 12:31 AM, clanc...@cybec.com.au wrote:
On Wed, 27 Jan 2010 10:21:00 -0800, deal...@gmail.com (dealtek) wrote:
Opening tables, etc
other idea: iframes, and plain old links.
On Fri, Jan 29, 2010 at 10:32 PM, Rene Veerman rene7...@gmail.com wrote:
flash(develop.org) perhaps? or does that also fall under their
no-active-x policy?
On Fri, Jan 29, 2010 at 9:17 PM, Haig Davis level...@gmail.com wrote:
Good Day All
http://mediabeez.ws/htmlMicroscope/
I'll be cleaning up and releasing the 1.3.0 code today / early next week..
On Sat, Jan 30, 2010 at 12:54 AM, Daevid Vincent dae...@daevid.com wrote:
I'm wondering if anyone has a PHP debug-type routine that will take a PHP
array and output it to the web page,
I've just wasted a few hours by trying to find a bug in my code that
messed up my JSON-passed-on-$_GET.
I'm using fopen() so please no nagging about putting JSON in $_POST..
I finally found the answer; in my distro's /etc/php5/apache2/php.ini,
magic_quotes_gpc is ON.
I'd like to know why, since
it's a homeserver, so i reckon it was ubuntu.. buncha lamers :)
On Mon, Feb 1, 2010 at 7:51 PM, Ashley Sheridan
a...@ashleysheridan.co.ukwrote:
On Mon, 2010-02-01 at 19:44 +0100, Rene Veerman wrote:
I've just wasted a few hours by trying to find a bug in my code that
messed up my JSON-passed
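The standard workaround at the time looked like this (get_magic_quotes_gpc() is the real legacy API; the helper name is mine):

```php
<?php
// Undo magic_quotes_gpc at the top of the script so JSON arriving in
// $_GET isn't mangled by the auto-added backslashes.
function undoMagicQuotes($value) {
    return is_array($value)
        ? array_map('undoMagicQuotes', $value)
        : stripslashes($value);
}
// guarded so this also runs on PHP versions without magic quotes
if (function_exists('get_magic_quotes_gpc') && get_magic_quotes_gpc()) {
    $_GET  = undoMagicQuotes($_GET);
    $_POST = undoMagicQuotes($_POST);
}
```

The cleaner fix is of course `magic_quotes_gpc = Off` in php.ini, but the shim above keeps the code portable across hosts you don't control.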
Is the client not receptive to that explanation?
On Mon, Feb 1, 2010 at 10:25 PM, Skip Evans s...@bigskypenguin.com wrote:
Ashley Sheridan wrote:
Why spend ages reinventing the wheel?
I've come across so many clients using third party carts who have been
unhappy with the final results. I
yea, try executing the sql statement
set global max_allowed_packet = 500 * 1024 * 1024;
from php?? (note; it sets it to 500mb)
not sure if your mysql server will allow this.
on shared hosting, you can expect they disabled the ability to change
it from php..
On Mon, Feb 1, 2010 at 10:27 PM,
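Issued from PHP it might look like this (mysqli assumed, wrapped in a function so nothing connects here; it needs the SUPER privilege, hence the shared-hosting caveat):

```php
<?php
// Sketch: raise mysql's max_allowed_packet at runtime. Requires the
// SUPER privilege, so expect it to fail on shared hosting.
function raisePacketLimit(mysqli $db, $megabytes) {
    $bytes = $megabytes * 1024 * 1024; // 500 -> 524288000 bytes
    return $db->query('SET GLOBAL max_allowed_packet = '.$bytes);
}
```

Note that SET GLOBAL only affects connections opened after the change; the current connection keeps its old limit.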
i'm a fan of adodb.sf.net, which i've used with both postgresql and mysql.
On Tue, Feb 2, 2010 at 9:23 PM, Lars Nielsen l...@mit-web.dk wrote:
Hi List
I am trying to make a Database Abstraction Layer so I can which the DB
of my application between MySQL and Postgresql. I have been looking at
oh, on using adodb.sf.net and 0-overhead for jumping between mysql and
postgresql;
keep all your queries to as simple early-standard sql as possible.
the auto_increment incompatibilities can be circumvented with a
relatively simple
function getMax($table, $field) {
in adodb, you'd loop through
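For reference, a sketch of how the truncated getMax() above might look in adodb style; this is an assumption on my part ($db would be an ADOConnection, names are illustrative):

```php
<?php
// Sketch: portable "current max id" lookup, replacing mysql-only
// auto_increment. The adodb-style Execute()/fields API is assumed.
function getMax($db, $table, $field) {
    $rs = $db->Execute("SELECT MAX($field) AS m FROM $table");
    return ($rs && !$rs->EOF) ? (int) $rs->fields['m'] : 0;
}
// the next insert would then use getMax($db, 'photos', 'id') + 1,
// ideally inside a transaction so two clients can't grab the same id
```

The transaction (or a table lock) matters: without it, two concurrent inserts can both read the same MAX and collide.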