I'm a professional backlink builder and I have been working for a company for two years now. One of the services I provide is Social Bookmarks. I have a "semi-automatic" system that gets the job done, but due to recent changes in my company I need to make the process 100% automated. No "almost" anymore. I use Bookmarking Demon to bookmark 200+ websites daily and I really need software to fully automate my social bookmarking tasks.
Basically, the software must be able to pull in the website and keyword combinations (via API GET requests), paste them into Bookmarking Demon, wait until it is done, test the links created to see if they were built correctly, and submit the links created via an API PUT request.
So, here is what the software has to do, step by step (it can be done in other ways, but the end result must be the same). I can provide you screenshots for all of this if you need them, or even make a video of myself doing it manually:
- It will send an API GET request and receive a list of websites with a specific anchor text for each website (1-10 websites with one anchor text each). Since I mainly use Pligg websites for the submissions and they don't accept URLs in the title (starting with http:// or https://), which is where the anchor text goes, any website that has a URL as its anchor text (again, only if it starts with http:// or https://; www.website.com is just fine) will be skipped and NOT pasted into Bookmarking Demon later. If all the websites have URLs as anchor texts, the whole order will be skipped and the order number saved into a text file so I can do it manually later using other software;
- In Bookmarking Demon there will be 2 groups of websites: My Websites and Other Websites. It must select all websites from "My Websites" and a random amount from the "Other Websites" group. It will always use the same amount unless edited, but I need to be able to edit this random amount. After that it must paste all the websites into Bookmarking Demon. Bookmarking Demon sometimes doesn't "recognize" a website and doesn't add it to the bookmarking list if its page is offline, so the software must know which websites were successfully added to the bookmarking list and edit their titles (which are the anchor texts). The description and tags are already added by Bookmarking Demon, so there's no need to edit them;
- It will save the project, the submission will start, and it will wait until at least X links are created. I need to be able to edit this amount too. If Bookmarking Demon finishes the submission and the minimum amount of bookmarks has not been reached (due to many failed submissions), it must save the order number into a txt file, delete the project, and just move on to the next order;
- It will then check the links built (maybe using the Scrapebox Free Link Checker, Sick Link Checker, or any free link checker you know) and send X of the successful (or "found") links to my company via an API PUT request. I have a PHP script for the request; only the order number and the links built are necessary. Again, the X amount of successfully found links the software will send must be editable.
- After reporting the links built, it must delete all the projects in Bookmarking Demon just to make sure it doesn't pick up the wrong links in the future.
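To make the anchor-text rule from the first step concrete, here is a minimal Python sketch of the filter. The order structure (a list of website/anchor pairs) is a hypothetical shape I'm assuming for illustration, not the actual API format:

```python
def has_url_anchor(anchor_text):
    """An anchor text counts as a URL only if it starts with http:// or https://."""
    return anchor_text.lower().startswith(("http://", "https://"))

def filter_order(order):
    """Keep only (website, anchor) pairs whose anchor text is not a URL.

    Returns the usable pairs; an empty list means the whole order
    should be skipped and its number logged for manual handling.
    """
    return [(site, anchor) for site, anchor in order
            if not has_url_anchor(anchor)]

# www.website.com as anchor text is fine; http://... anchors are skipped.
usable = filter_order([
    ("http://site-a.com", "best widgets"),
    ("http://site-b.com", "http://spam.example"),
    ("http://site-c.com", "www.website.com"),
])
```

If `usable` comes back empty, the order number would be appended to the skip file instead of being pasted into Bookmarking Demon.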
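The link-verification step could also be approximated in code. This is a simple stand-in for Scrapebox Free Link Checker or Sick Link Checker (a plain HTTP status check; function names are mine, not from any specific tool), and it only returns links to report when the editable minimum is reached:

```python
from urllib.request import Request, urlopen
from urllib.error import URLError

def check_link(url, timeout=10):
    """Return True if the bookmark URL responds successfully over HTTP."""
    try:
        req = Request(url, headers={"User-Agent": "Mozilla/5.0"})
        with urlopen(req, timeout=timeout) as resp:
            return 200 <= resp.status < 400
    except (URLError, ValueError, OSError):
        return False

def verify_links(links, minimum, checker=check_link):
    """Run every built link through the checker.

    Returns `minimum` verified links to report via the PUT request,
    or None when too few links are found (the order must NOT be sent).
    """
    found = [url for url in links if checker(url)]
    return found[:minimum] if len(found) >= minimum else None
```

Passing a custom `checker` makes it easy to swap in an external link-checking tool later without changing the reporting logic.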
A few things to consider:
- I use Bookmarking Demon on a VPS right now and I believe I should probably keep using it, right? So the software must check that my computer is connected to the VPS every time a new order is about to start, and if it's not (maybe because my internet went down, which happens sometimes) it must reconnect to the VPS;
- If anything at all goes wrong in the process, the software must be able to identify it, or at least keep it from affecting future orders. I mean, it should keep functioning and continue to deliver the next orders without any problem;
- The software will use an API request to pull in the next order in my account, but if it is not able to deliver the order, the order will become available again an hour or so later, so when it sends the request the same order will be pulled in. If it receives the same order a second time it should try to deliver it again, but if it receives an order a third time (meaning it failed twice to deliver it) it must save the order number to a text file and skip it every time it is pulled in again.
- Since any failed order (meaning the software sent the links built but they didn't contain X successfully built links) negatively affects my business, ALL the links built must pass through the link checker before they get reported. It is a must. If the link checker fails to find the minimum amount of links I need to submit per order (the same amount the software will report back), it will NOT send them.
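The retry rule above (deliver, retry once, then blacklist on the third receipt) could be tracked with a simple counter. This is a rough sketch under the assumption that the skipped order numbers live in a plain text file, as described; the file name is just a placeholder:

```python
def should_skip(order_number, receipt_counts, skip_file="failed_orders.txt"):
    """Decide whether a pulled-in order should be skipped.

    An order is attempted on its first and second receipt; on the third
    receipt (two failed deliveries) its number is logged to the skip file
    and it is skipped every time it is pulled in again.

    receipt_counts is a dict kept in memory across the run; the skip
    file persists blacklisted order numbers between restarts.
    """
    with open(skip_file, "a+") as f:
        f.seek(0)
        blacklisted = {line.strip() for line in f if line.strip()}
        if str(order_number) in blacklisted:
            return True
        receipt_counts[order_number] = receipt_counts.get(order_number, 0) + 1
        if receipt_counts[order_number] >= 3:
            f.write(f"{order_number}\n")
            return True
    return False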
If there's anything you think you could do in another way, that's completely fine. As I said, this is how I see the software working, but if you make it successfully deliver the orders some other way, no problem at all. What matters is that it pulls in new link building orders, builds those links, and submits a report back.
Also, I have no idea at all how much software like this will cost me, so I'm completely open to offers.
PS: The response it will get after sending the GET request to the API can be received in either XML or JSON, whichever you prefer.
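If JSON is chosen, parsing the GET response could look roughly like this. The field names below are purely hypothetical placeholders for illustration; the real response format comes from my company's API:

```python
import json

# Hypothetical response shape -- the real field names may differ.
sample_response = """{
    "order_number": 12345,
    "targets": [
        {"website": "http://example.com/page", "anchor_text": "blue widgets"},
        {"website": "http://example.org/post", "anchor_text": "www.website.com"}
    ]
}"""

order = json.loads(sample_response)
pairs = [(t["website"], t["anchor_text"]) for t in order["targets"]]
```

The `order_number` would be carried through to the PUT report and to the skip file, and the website/anchor pairs would go through the anchor-text filter before being pasted into Bookmarking Demon.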
Thank you for your attention
API GET Request Response:
API PUT Request Sender:
Bookmarking Demon Interface:
Payment method/ details: PayPal