  1. #1
    Regular Coder
    Join Date
    Oct 2009
    Posts
    434
    Thanks
    7
    Thanked 3 Times in 3 Posts

    Securing AJAX so only the page that called it can get results

    I have noticed an increase in hits to just my getresults.php file, but nowhere near as many hits to the page that should be calling that AJAX file.

    I also added some code to my AJAX file to record the requests made and found a lot of completely random requests, some of which are completely out of character and nowhere near what should be entered in the autocomplete fields on the main public webpage.

    My page asks for a postcode and then a road, and the road field shows suggestions once the third character has been entered.

    But my AJAX file getresults.php, which is called from the road field on the main webpage, is hit thousands of times more than the home page, which led me to think that someone has hijacked the page to grab all the data in the database. That seems to be the case: the search terms used are nowhere near what they should be, and they cause a fair amount of load from multiple IPs. When they do get close, my MySQL 'LIKE' search lets the database return results, and loads of rows are sent back, again causing load on the server. The site is not popular enough for this sort of load or interest, and I really need a way to stop people using a dummy form on their own site and having the POST sent to my getresults.php page.
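    For context, a minimal sketch of what an endpoint like this typically looks like (the table, column and connection details below are hypothetical, not taken from this thread); it shows why every keystroke past the third one costs a LIKE query against the database:

        <?php
        // getresults.php - illustrative only; table, column and connection names are made up.
        $postcode = isset($_POST['postcode']) ? $_POST['postcode'] : '';
        $road     = isset($_POST['road']) ? $_POST['road'] : '';

        // Mirror the page behaviour: no suggestions until the third character.
        if (strlen($road) < 3) {
            header('Content-Type: application/json');
            exit(json_encode(array()));
        }

        $pdo  = new PDO('mysql:host=localhost;dbname=addresses', 'dbuser', 'dbpass');
        $stmt = $pdo->prepare(
            'SELECT road FROM roads WHERE postcode = ? AND road LIKE ? LIMIT 20'
        );
        $stmt->execute(array($postcode, $road . '%'));   // every request runs a LIKE query

        header('Content-Type: application/json');
        echo json_encode($stmt->fetchAll(PDO::FETCH_COLUMN));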

    What methods would people suggest to prevent someone doing this unless they first came from the main page my form is on? Using a session to store the date might work for a moment or two, but all they have to do is visit the main page, keep it open, then go back to the page they created and continue where they left off. I know because I tried this from a different site I have, and I was able to get pages of results for search terms that I knew would be in the tables.

  • #2
    Supreme Master coder! glenngv
    Join Date
    Jun 2002
    Location
    Philippines
    Posts
    11,047
    Thanks
    0
    Thanked 251 Times in 247 Posts
    People don't have to use a dummy form of their own to submit to your site. They can simply open the browser dev tools while they are on your site and manipulate the HTML and script from there.

  • #3
    Regular Coder
    Join Date
    Oct 2009
    Posts
    434
    Thanks
    7
    Thanked 3 Times in 3 Posts
    You know what, I use those dev tools and never thought about it like that. That totally opened my eyes to a few of my other pages.

    So how could I prevent a robot or human from accessing my getresults.php file unless they actually came from the main page that my form is on?

  • #4
    Master Coder felgall
    Join Date
    Sep 2005
    Location
    Sydney, Australia
    Posts
    6,640
    Thanks
    0
    Thanked 649 Times in 639 Posts
    There is no way of telling where they came from - the one header that can provide that information (the Referer header) can also be set using the developer tools so that it looks like they came from your main page.

    The best that you can do is to thoroughly test the data that your script is receiving to determine if the values passed to it are meaningful. If the script receives valid data then you would need to assume that the request is valid and process it.

    If you have a CAPTCHA in the form then the expected values will include whatever that CAPTCHA passes to the server for validation. For example, if you use a time CAPTCHA then one of the fields will contain an encrypted value representing the time the form was first loaded, so that you can compare it to the current time and verify that the form wasn't submitted too quickly for a person to have filled it out. To emulate a genuine submission they'd need to analyse your form closely enough to work out what the hidden field represents and to identify which encryption you are using, so that they could generate their own valid value for that field. That would just about guarantee that if the field contains a valid value, it was a genuine submission from the main page.
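    For illustration, a rough sketch of the kind of time check being described, assuming the load time is encrypted into a hidden field when the form page is generated (the key, field name and cipher choice below are assumptions, not details from the post):

        <?php
        // --- When rendering the form page: embed an encrypted "form loaded at" timestamp ---
        $key   = 'replace-with-a-long-random-secret';   // assumption: a server-side secret
        $iv    = openssl_random_pseudo_bytes(16);
        $enc   = openssl_encrypt((string)time(), 'aes-256-cbc', $key, OPENSSL_RAW_DATA, $iv);
        $token = base64_encode($iv . $enc);
        echo '<input type="hidden" name="loaded_at" value="' . htmlspecialchars($token) . '">';

        // --- When the submission (or AJAX request) comes back: decrypt and compare to now ---
        $raw = base64_decode(isset($_POST['loaded_at']) ? $_POST['loaded_at'] : '', true);
        $iv  = substr($raw, 0, 16);
        $ts  = openssl_decrypt(substr($raw, 16), 'aes-256-cbc', $key, OPENSSL_RAW_DATA, $iv);

        if ($ts === false || time() - (int)$ts < 5) {
            // Decryption failed, or the form came back within 5 seconds of being served:
            // too quick for a person to have filled it out.
            header('HTTP/1.1 403 Forbidden');
            exit;
        }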
    Stephen
    Learn Modern JavaScript - http://javascriptexample.net/
    Helping others to solve their computer problem at http://www.felgall.com/

    Don't forget to start your JavaScript code with "use strict"; which makes it easier to find errors in your code.

  • #5
    Regular Coder
    Join Date
    Oct 2009
    Posts
    434
    Thanks
    7
    Thanked 3 Times in 3 Posts
    A CAPTCHA would not work, as it is an AJAX request.

  • #6
    Regular Coder
    Join Date
    Jan 2013
    Location
    Sunnyvale, CA
    Posts
    104
    Thanks
    6
    Thanked 7 Times in 7 Posts
    I do it all the time.

    Your server that exposes the SOAP function needs to authenticate the request before processing it and sending back a response.

    In my solution I rotate non-sequential cryptographic nonces (i.e. single-use, multi-character keys that may only be deployed once) every five minutes to ensure security.

    Okay, I can see this is going to be complicated, so I'll elaborate and try to keep it simple:

    1. Add a table to your database called tblNonce with the field nonId (integer).
    2. Add a hidden nonce input field to your HTML page (i.e. <input type="hidden" id="nonce">).
    3. When your web server initially serves your site's web page, have your database store a new value in tblNonce, and make sure your web server primes your HTML page's nonce field with that value.
    4. Every time you send an AJAX request from that page to your SOAP function, include the nonce with the request and have your SOAP function check the database to see whether the nonce is valid. If it is valid, replace it with a new [random/non-sequential] nonce and update your HTML page's nonce field with the new value when you process the AJAX response. If your application will be used by many users then you'll obviously need a nonce per user, and your database's nonce table will need a userId field as well.
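    A rough server-side sketch of steps 3 and 4, assuming PDO. Note that a random string is used for the nonce here, so nonId would need to be a char/varchar column rather than the integer suggested in step 1, and the connection details and JSON shape are just placeholders:

        <?php
        $pdo = new PDO('mysql:host=localhost;dbname=mydb', 'dbuser', 'dbpass');   // placeholder credentials

        // Step 3: when serving the page, create a nonce, store it, and echo it into
        // the hidden #nonce field in the page template.
        function createNonce(PDO $pdo) {
            $nonce = bin2hex(openssl_random_pseudo_bytes(16));   // random, non-sequential
            $pdo->prepare('INSERT INTO tblNonce (nonId) VALUES (?)')->execute(array($nonce));
            return $nonce;
        }

        // Step 4: in the AJAX endpoint, accept the request only if the nonce is on file,
        // then burn it and hand a fresh one back with the response.
        $nonce = isset($_POST['nonce']) ? $_POST['nonce'] : '';
        $del   = $pdo->prepare('DELETE FROM tblNonce WHERE nonId = ?');
        $del->execute(array($nonce));

        if ($del->rowCount() === 0) {            // unknown or already-used nonce
            header('HTTP/1.1 403 Forbidden');
            exit;
        }

        header('Content-Type: application/json');
        echo json_encode(array(
            'results' => array(),                // placeholder: run the normal autocomplete query here
            'nonce'   => createNonce($pdo),      // the client copies this into #nonce for the next call
        ));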

    Since hackers do not possess the initial nonce, they will never have valid credentials to access your restricted function. Changing the nonce every few minutes ensures that even if one is intercepted, by the time it has been decrypted it will be useless.

    Bear in mind that anyone who has privileges to access your main site page will have access to the initial nonce, and thereafter may call your function freely. Hence, to be truly effective, you need to authenticate the request for the initial HTML page as well. Also, if you do not deploy this over secure HTTP then the communications may be easily intercepted and the nonce quickly recovered, so consider your solution's architecture and secure it accordingly.

  • #7
    Regular Coder
    Join Date
    Oct 2009
    Posts
    434
    Thanks
    7
    Thanked 3 Times in 3 Posts
    thoughts on this idea...

    A visitor, bot or otherwise, comes to the site and clicks the contact link. They are taken to the contact page, and only on the contact page is a random hash created and stored in a session variable. This hash stays active until the page is refreshed or the session is older than 5 minutes, at which point the server deletes all sessions stored in my own sessions folder that are older than 5 minutes (this feature is already active).

    The visitor then fills out the form, comes to the AJAX field and starts to type in it. The session should, in theory, stay with the visitor even for the AJAX requests?
    The AJAX file (getresults.php) that finds the results to show first checks that the session hash is set (isset) before processing the request.
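    A rough sketch of that check, assuming the hash is only ever created on the contact page and getresults.php just verifies that it exists and is fresh (the variable names and the 5-minute figure follow the description above; everything else is illustrative):

        <?php
        // contact.php - only the real contact page ever creates the hash.
        session_start();
        $_SESSION['form_hash'] = bin2hex(openssl_random_pseudo_bytes(16));   // random hash
        $_SESSION['form_time'] = time();

        // getresults.php - refuse to run the lookup unless the session hash exists
        // and is less than five minutes old.
        session_start();
        if (!isset($_SESSION['form_hash'], $_SESSION['form_time'])
                || time() - $_SESSION['form_time'] > 300) {
            header('HTTP/1.1 403 Forbidden');
            exit;
        }
        // ...run the normal autocomplete query and echo the results here...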

    Is this something that could work, any flaws with this method?

    Yes, I know that all the visitor would need to do, if they wanted, is refresh the page and they are back in business to cause the extra load. But that would only be after they had waited 5 minutes, so no repeated requests every second or so from the actual bots.
    Last edited by needsomehelp; 04-02-2014 at 02:03 PM.

  • #8
    Supreme Master coder! glenngv
    Join Date
    Jun 2002
    Location
    Philippines
    Posts
    11,047
    Thanks
    0
    Thanked 251 Times in 247 Posts
    A malicious script could still be run from the dev tools to continuously send AJAX requests to getresults.php until the 5-minute timeout is over, and then repeat after refreshing the page. Or it could even do this automatically from another page on your site in a separate window and run continuously without intervention.

  • #9
    Regular Coder
    Join Date
    Oct 2009
    Posts
    434
    Thanks
    7
    Thanked 3 Times in 3 Posts
    Yes, I know that's still possible. But from the point of view of a bot that does not refresh (or not always), most of the requests have at times arrived at the same time, meaning I may get 500 to 600 requests in just one minute. The most a real person would ever need in a 2-minute slot to complete the form would be anything from 200 to 300 AJAX requests, but these bots can make anything from 1,200 or more.

