Would anyone care to put forward a solution to proactively tackle a similar script?
What I am specifically after: methods to know that this one computer (keeping it simple) has sent, say, 1,000 requests in a short time (i.e. too quick to be human).
Before some of you lay the blame purely on AT&T for having poor code, consider other scenarios which are similar but different. Perhaps we want to use this to throttle API requests, or to tackle a brute-force attempt on the login.
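One simple way to spot "too quick to be human" is to keep a sliding window of recent request timestamps per client and refuse once a threshold is exceeded. A minimal sketch (in Python for brevity; the threshold, window, and per-IP key are assumptions you would tune for your site):

```python
import time
from collections import defaultdict, deque

class RateLimiter:
    """Sliding-window limiter: at most `max_requests` per `window` seconds per client."""

    def __init__(self, max_requests=100, window=60.0):
        self.max_requests = max_requests
        self.window = window
        self.hits = defaultdict(deque)  # client_id -> timestamps of recent requests

    def allow(self, client_id, now=None):
        now = time.monotonic() if now is None else now
        q = self.hits[client_id]
        # Drop timestamps that have fallen out of the window.
        while q and now - q[0] > self.window:
            q.popleft()
        if len(q) >= self.max_requests:
            return False  # too many requests: likely a script, so throttle or challenge
        q.append(now)
        return True
```

For a brute-force login defence you would key on username as well as IP, and respond with a delay or CAPTCHA rather than a flat refusal.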
A little more info, but not a lot. More monitoring.
Equally, I discovered PHPIDS and read up on how it works. I'm not sure it would have picked up on this attack vector, as it would have been legitimate traffic, just used rapidly.
But surely, as the "webapp", you can also revoke a user's subscription.
I have been thinking about "recurring payment" with an agreed term: "we, the webapp" will cancel your paid access if you have been absent from the site for "x days", or more likely two months.
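The absence check itself is tiny; a sketch, assuming you store a last-seen date per subscriber and pick 60 days as the agreed grace period:

```python
from datetime import date, timedelta

ABSENCE_LIMIT = timedelta(days=60)  # roughly the "2 months" grace period

def should_cancel(last_seen, today=None):
    """True if the subscriber has been absent longer than the agreed limit."""
    today = today or date.today()
    return today - last_seen > ABSENCE_LIMIT
```

A nightly cron job over the subscriber table, cancelling anyone for whom this returns true, would be enough.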
Like many a member here, I am building something. In my case in PHP.
I have lovingly crafted (wasted time on) a system, still in its infancy, to handle a form of automated test-driven development, and from what I can see so far, the times I break it or later discover a bug (a logic bug) in my code, it is usually due to weak tests.
Clearly I am not a large site like wordpress.com, but a part of me thinks it's the right thing to do. It certainly makes deploying a breeze and gives a little more "confidence" that the system didn't just fall over.
The sad part, at least in the PHP world, is that there are no libraries with an explanation of how best to use them with your site/application structure.
The database part is the biggest hiccup.
My solution is a duplicate database with no data.
First it confirms that the "fake db" matches the "real db" and warns if the table structure is different.
Then, with my blank "fake db", I use "setup" and "destroy" functions in my tests to purposely build data to test with.
Once my site is operational, I will look to using live data in the "fake db" to simulate with real data. But so far it has been an interesting journey.
Clearly, TDD would become a bigger issue if you have to test sphinx/couchdb/mongodb etc. setups, but like with all creations, it starts with a blank.php. (in my case)
Not that this covers "Optimise for security".
But my test suite, after the first time a test has been run, will warn if the "library", "model", or "controller" it is linked to has been altered without the "test" file changing, which at least alerts me to the idea that maybe I need to refine my test.
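That "source changed but test didn't" warning amounts to remembering a fingerprint of each source/test pair between runs. A sketch, assuming content hashes and a simple state dict persisted between runs (how files are paired up is left to you):

```python
import hashlib

def fingerprint(path):
    """Content hash of a file, so edits are detected even if mtimes are unreliable."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def stale_tests(pairs, state):
    """pairs: (source_file, test_file). Warn when a source changed but its test did not.

    `state` maps source path -> {"src": hash, "test": hash} from the previous run
    and is updated in place.
    """
    warnings = []
    for src, test in pairs:
        src_fp, test_fp = fingerprint(src), fingerprint(test)
        old = state.get(src)
        if old and old["src"] != src_fp and old["test"] == test_fp:
            warnings.append(f"{src} changed but {test} did not: refine the test?")
        state[src] = {"src": src_fp, "test": test_fp}
    return warnings
```

Serialising `state` to a JSON file at the end of each run is enough to carry it between invocations.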
All very pedantic, but reading from the sidelines of patio11, I can't help but see the logic in "automating" things to make your life easier, more efficient, and less likely to add human error into the equation.
I leave with what I consider a valid point.
Once upon a time, people looked at the "MVC" approach as time-consuming and wasteful. At least that's my understanding from watching the web evolve from the early days.
The downside of my excitement over this URL: as of the afternoon of 28th April, it is still claiming it will be released on the 27th. A shame; I was curious to see what and how they tested people's cybersecurity skills.
Setting aside the already growing snarky comments, I too am at a loss, probably because I have never heard of them; secondly, their website forces me to stay in the iPhone version.
What am i missing?
Based on the blog post, they mentioned iGoogle ahead of games. Could it be linked to creating more of an iGoogle game marketplace?
Also, I couldn't help but notice how a game for iGoogle would be constrained to small dimensions, just like those of a mobile.
I like to do my fair share of running, and found his comment about "marathon runners" having a suspiciously high cancer rate striking. Curious and wanting to know more...
Granted, I have only skimmed the following (i.e. this stuff is hot off the internet press):
http://cebp.aacrjournals.org/content/17/1/183.full
This is a full-text article discussing how regular exercise, which I consider running and marathon training to be part of, is useful in the fight against colon cancer, yet they are not 100% sure why.
I think the lack of citations is because it's a concentrate of his theories & discoveries. Individual articles (many of which are paywalled, sadly) seem to have better references. It'll be interesting to see how well referenced his book is.
All he mentioned was that a few markers associated with cancer are elevated in marathon runners. The actual markers indicate inflammation and a leaky blood brain barrier, which could be caused by lots of things besides cancer.
I too am a huge fan of SQL, of what MySQL and Postgres have given to the world, and of how they have inspired and planted seeds in many of us to use open source and contribute to it.
I am dipping my toes into couchdb, just to see what all the noise is about.
Still getting my head around map/reduce and the fact I'm writing in JavaScript. All that aside, the most exciting factor for me: it is so much easier to write custom functions for it than in SQL. (MySQL, and yes, I only tried via phpMyAdmin.)
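CouchDB views themselves are written in JavaScript, but the map/reduce shape translates directly; a toy sketch of the same idea (documents are fed one at a time to a map function that emits key/value pairs, then a reduce combines the values sharing a key, much like CouchDB's built-in `_sum`):

```python
from collections import defaultdict

# Toy documents, roughly what CouchDB would feed a view one at a time.
docs = [
    {"type": "post", "author": "ana", "words": 120},
    {"type": "post", "author": "ben", "words": 80},
    {"type": "post", "author": "ana", "words": 50},
]

def map_doc(doc):
    """Like a CouchDB map function: emit (key, value) pairs per document."""
    if doc.get("type") == "post":
        yield doc["author"], doc["words"]

def reduce_by_key(pairs):
    """Like a _sum reduce: combine all values that share a key."""
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

emitted = [pair for doc in docs for pair in map_doc(doc)]
totals = reduce_by_key(emitted)
```

In SQL this would be a `GROUP BY author` with `SUM(words)`; the map/reduce version just makes the two phases explicit and lets the map be arbitrary code.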
Uh, he said "fan of SQL Server", which is Microsoft's SQL product. Fans of SQL Server tend not to be fans of mySQL at all, since SQL Server makes mySQL look like a "bottom-feeder" in the original poster's words. Not that I disagree.