Handling large volumes of data with Prosper202


IMO the default version of Prosper202 isn’t really designed to handle large-volume media buys. The latest version, 1.7.2, does make things easier with a simple clear-click-data button, but it can still prove difficult to pull reports once the amount of data you are analysing exceeds a certain limit. The dedicated server I use for Prosper is a twin Intel Xeon quad core with 24GB of RAM, and it still complains once the click database exceeds 4GB.
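If you want to keep an eye on how close you are to that point, a quick way is to ask MySQL for the size of each table in your 202 database. This is just a rough sketch, not part of Prosper202 itself; the hostname, credentials and database name below are placeholders, so swap in whatever your own 202 install uses:

<?php
// Rough sketch: list every table in the current database with its size in MB,
// largest first. The connection details are placeholders – use the same
// credentials your Prosper202 config uses.
$db = new mysqli('localhost', 'db_user', 'db_pass', 'prosper202');

$result = $db->query(
    "SELECT TABLE_NAME AS tbl,
            ROUND((DATA_LENGTH + INDEX_LENGTH) / 1024 / 1024, 1) AS size_mb
     FROM information_schema.TABLES
     WHERE TABLE_SCHEMA = DATABASE()
     ORDER BY (DATA_LENGTH + INDEX_LENGTH) DESC"
);

while ($row = $result->fetch_assoc()) {
    echo $row['tbl'] . ': ' . $row['size_mb'] . " MB\n";
}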

I was trying to analyse keywords from a particular traffic source and I had to narrow the search to just a few hours, otherwise I would get the error below:
Fatal error: Maximum execution time of 60 seconds exceeded in /var/www/vhosts/myhost.com/httpdocs/202-config/functions.php on line 16

This made it pretty much impossible to analyse the data, so I modified the file in question and extended the timeout limit.

To do this I opened up /202-config/functions.php and added the following line just below the opening <?php tag:
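Something along these lines does the job; the 300-second limit is just an example value, so raise it further (or use 0 for no limit at all) if your reports need longer:

// Raise PHP's per-request execution limit so long-running Prosper202
// reports aren't killed after the default 60 seconds.
// 300 seconds is an example value – adjust to suit, or pass 0 for no limit.
ini_set('max_execution_time', 300);
set_time_limit(300);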


It’s still a slow and painful process waiting for the report to run, but having all the data in one place is a much better compromise than having to run reports in 3-hour blocks. Coffee break anyone?

