Crawl 1.000.000 Links
bladex21
Tenderfoot
Posts: 12
Posted: Thu Feb 01, 2018 8:15 am
Hello, I have a website with 1,000,000 links, most of which no longer exist. I managed to crawl it once, but the program crashed, and a full run takes more than 24 hours. Please advise.
LinkAssistant
Site Admin
Posts: 5459
Posted: Fri Feb 09, 2018 2:45 pm
You need to increase the memory limit as described here: https://www.link-assistant.com/support/ ... powersuite
_________________
Like SEO PowerSuite? PM us if you don't mind recording a 20-second video testimonial. See a spammer? Click "Report this Post" (bottom right) and help keep our forum clean!
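[Editor's note: a rough back-of-envelope estimate shows why a million-URL crawl can exhaust a default Java heap. The ~2 KB-per-URL bookkeeping cost below is an assumption for illustration, not a measured figure from SEO PowerSuite.]

```python
# Back-of-envelope: memory needed to hold crawl state for 1,000,000 URLs.
# BYTES_PER_URL is an assumed cost (URL string, status, headers, link-graph
# edges) per crawled page, not a measured value.
URLS = 1_000_000
BYTES_PER_URL = 2_000  # assumed ~2 KB of bookkeeping per URL

total_gb = URLS * BYTES_PER_URL / 1024**3
print(f"{total_gb:.1f} GiB")  # ~1.9 GiB, close to the ceiling of a 32-bit JVM
```

Even under this conservative assumption, the working set approaches 2 GiB, which is why raising the memory limit (and running a 64-bit JVM) matters for crawls of this size.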
bladex21
Tenderfoot
Posts: 12
Posted: Tue Feb 20, 2018 1:59 pm
That change made it so SG wouldn't start.
LinkAssistant
Site Admin
Posts: 5459
Posted: Tue Feb 20, 2018 2:38 pm
Quote:
> That made it so SG wouldn't start

Most probably you are using 32-bit Java. Please visit www.java.com and install the 64-bit Java version; this should solve the problem.
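[Editor's note: a quick way to confirm whether a runtime is 32- or 64-bit is to inspect its pointer size. The Python sketch below illustrates the idea; for Java itself, `java -version` typically reports "64-Bit Server VM" on a 64-bit install.]

```python
import struct

# Pointer size (in bits) of the current interpreter: 32 on a 32-bit
# runtime, 64 on a 64-bit one. The same distinction determines how much
# heap a JVM can address, which is why 32-bit Java fails with large limits.
bits = struct.calcsize("P") * 8
print(f"{bits}-bit runtime")
```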
sinelogixtech
Small God
Posts: 458
Posted: Fri Apr 13, 2018 5:36 am
If you have a sitemap index with 1,000,000 URLs, Google will never index them all. Each time the sitemap is resubmitted, Google goes back and re-crawls the already-indexed URLs, and by the time it reaches the end of those, the sitemap has been resubmitted again.