Posted on August 09, 2016 by Nikolay Arefiev
Web crawler visit frequency and depth are calculated by a set of adaptive algorithms that search engine developers prefer not to disclose. When you run an advertising campaign to promote your goods or services over the Internet, or even just update your website content, it is important to understand how, and how quickly, search engines will pick up these changes.
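One practical way to estimate crawl frequency for your own site is to count crawler requests per day in the web server access log. The sketch below is illustrative only: the log lines, dates, and the `Googlebot` user-agent filter are assumptions, and a real script would read the log file instead of a hard-coded list.

```python
import re
from collections import Counter

# Hypothetical combined-format access log lines; IPs, paths, and dates are made up.
LOG_LINES = [
    '66.249.66.1 - - [09/Aug/2016:10:01:22 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [09/Aug/2016:10:05:10 +0000] "GET /blog HTTP/1.1" 200 2048 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.7 - - [09/Aug/2016:10:06:00 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
    '66.249.66.1 - - [10/Aug/2016:09:00:00 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
]

# Captures the dd/Mon/yyyy part of the timestamp in square brackets.
DATE_RE = re.compile(r'\[(\d{2}/\w{3}/\d{4})')

def crawler_visits_per_day(lines, bot_marker="Googlebot"):
    """Count requests per day whose line contains the given bot marker."""
    counts = Counter()
    for line in lines:
        if bot_marker in line:
            m = DATE_RE.search(line)
            if m:
                counts[m.group(1)] += 1
    return dict(counts)

print(crawler_visits_per_day(LOG_LINES))
# {'09/Aug/2016': 2, '10/Aug/2016': 1}
```

Tracking this count over time shows whether content updates coincide with more frequent crawler visits.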
Posted on July 21, 2016 by Yury Sergeev
Hackers are using new sophisticated botnet-driven SEO attacks to promote adult sites and online pharmacies. Web search engines use different algorithms to rank a site, but one of the most significant parameters is how many sites link to it, and how highly ranked those sites are. Attackers therefore compromise hundreds of websites and inject links into them to increase the search engine ranking of the sites they promote.
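A site owner can look for this kind of injection by scanning their own pages for links pointing at typical spam keywords. A minimal sketch using only the standard library's `html.parser` is shown below; the keyword list and the sample page are invented for illustration, and a real check would use an up-to-date spam-domain feed rather than a few hard-coded words.

```python
from html.parser import HTMLParser

# Illustrative keyword list; a real deployment would use a threat-intel feed.
SUSPICIOUS_WORDS = {"viagra", "casino", "pharmacy", "cialis"}

class LinkCollector(HTMLParser):
    """Collect href targets of <a> tags for later inspection."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def suspicious_links(html):
    """Return links whose URL contains a spam-related keyword."""
    parser = LinkCollector()
    parser.feed(html)
    return [l for l in parser.links
            if any(w in l.lower() for w in SUSPICIOUS_WORDS)]

# Injected links are often wrapped in hidden elements so visitors never see them.
page = ('<p>Welcome!</p>'
        '<div style="display:none">'
        '<a href="http://example.com/cheap-pharmacy">pills</a></div>'
        '<a href="/about">About</a>')
print(suspicious_links(page))
# ['http://example.com/cheap-pharmacy']
```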
Posted on July 07, 2016 by Yury Sergeev
Today, online business is affected by numerous attacks. Some are performed by people and some by bots: software applications that run automated tasks to compromise websites. Both, however, use dynamic IP rotation to evade security solutions, relying on Tor networks and open proxies to disguise their real IP addresses. As defenders, we should use threat intelligence data to check the IP reputation of visitors and, if we have dedicated software protecting our web applications, configure it to block malicious human and bot traffic.
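At its simplest, an IP reputation check is a lookup against known-bad addresses and ranges. The sketch below hard-codes a tiny blocklist for illustration; in practice the data would come from a threat-intelligence feed such as a Tor exit-node list or an IP reputation service, and the addresses here are reserved documentation ranges, not real attackers.

```python
import ipaddress

# Illustrative blocklist; a real one would be refreshed from a threat-intel feed.
BAD_IPS = {"198.51.100.23", "203.0.113.99"}
BAD_NETS = [ipaddress.ip_network("192.0.2.0/24")]  # e.g. a known open-proxy range

def is_blocked(client_ip: str) -> bool:
    """Return True if the client IP is on the blocklist or inside a blocked range."""
    if client_ip in BAD_IPS:
        return True
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in BAD_NETS)

print(is_blocked("192.0.2.15"))  # True, falls inside the blocked range
print(is_blocked("8.8.8.8"))     # False
```

A web application firewall or reverse proxy would run such a check on every request before passing it to the application.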
Posted on May 28, 2016 by Yury Sergeev
If you have a web app running on a web server, you know how helpful it is to analyse the access and error logs to understand how it is functioning. Systems administrators use different tools to analyse HTTP statistics, but not all of them provide enough information. For instance, Google Analytics can't provide client IP addresses, which can be really useful when you want to investigate what was going on. By looking into a website access log it is possible to extract client IP addresses and analyse their reputation. Therefore, it is essential to use dedicated instruments for log analysis. As an illustration, RST Cloud shows which malicious sources that had repeatedly attacked other websites also contacted your web server.
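Extracting client IPs from an access log is straightforward, since in the common and combined log formats the client address is the first field of each line. A minimal sketch follows; the sample log text is invented, and the resulting IP list is what you would then feed into a reputation service.

```python
import re
from collections import Counter

# The client IP is the first field in common/combined log format lines.
IP_RE = re.compile(r'^(\d{1,3}(?:\.\d{1,3}){3})')

# Hypothetical access-log excerpt; addresses come from documentation ranges.
SAMPLE_LOG = """\
203.0.113.7 - - [28/May/2016:12:00:01 +0000] "GET /wp-login.php HTTP/1.1" 404 209
203.0.113.7 - - [28/May/2016:12:00:02 +0000] "GET /admin HTTP/1.1" 404 209
198.51.100.4 - - [28/May/2016:12:01:00 +0000] "GET / HTTP/1.1" 200 1024
"""

def top_clients(log_text, n=10):
    """Extract the leading client IP of each line and return the busiest ones."""
    counts = Counter()
    for line in log_text.splitlines():
        m = IP_RE.match(line)
        if m:
            counts[m.group(1)] += 1
    return counts.most_common(n)

print(top_clients(SAMPLE_LOG))
# [('203.0.113.7', 2), ('198.51.100.4', 1)]
```

The IPs probing paths like `/wp-login.php` are exactly the ones worth checking against a reputation database.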
Posted on December 18, 2015 by Yury Sergeev
Every website, irrespective of its popularity, gets attacked almost every day. A huge number of security bots constantly scan random websites trying to find vulnerabilities that can be used for intrusion. These vulnerabilities are then used by malefactors to manually hack the compromised website.