A program that collates all the IPs from a given website (or group of websites) or from a given file. Once collated, each IP is evaluated by running whois on it. The evaluation results are written as a table to a calc file.
The Origin
I was instructed to collate all the IPs from a single calc file and then apply whois to each one. This task might be easy if you are given no more than 20 IPs, but the calc file had thousands, and the work would be tedious if you had to check every IP one by one, record the whois result, and make sure there were no duplicates.
With my knowledge of Python, I designed a program to do the heavy lifting.
The Process
With the idea in place, it was time for the design! In terms of data storage, just as in some of my previous tools, I'll make use of a plain text file for now. I also decided to parse not only from a file but also from a webpage. So this will be the flow.
<a>The user chooses where to collate the IPs from: web or file.
print("""
********Search IP on**********
1. Web
2. File
3. Exit
*****************************
""")
# read the menu choice into the input variable
ans = input()
if ans == "1":
    webexec()
elif ans == "2":
    file_exec()
elif ans == "3":
    exit()
*If the user decides to collect the IPs from a URL or group of websites, the user must fill in search_input.txt; that file contains all the sites to parse.
*If the user decides to collate the IPs from a file, the program prompts the user for the exact file path.
*In either case, the process continues at <b>.
The user may also exit by choosing 3. A minimal sketch of this input-selection step follows.
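The post doesn't show how webexec() and file_exec() gather their input, so here is a minimal sketch, assuming search_input.txt holds one URL per line and the file branch simply asks for a path; the helper names collect_from_web and collect_from_file are hypothetical.

import urllib.request

def collect_from_web():
    # read one URL per line from search_input.txt (assumed format)
    with open("search_input.txt") as f:
        urls = [line.strip() for line in f if line.strip()]
    pages = []
    for url in urls:
        # fetch the raw page text so the IP parser can scan it later
        with urllib.request.urlopen(url) as resp:
            pages.append(resp.read().decode(errors="ignore"))
    return "\n".join(pages)

def collect_from_file():
    # ask the user for the exact path of the file to scan
    path = input("Enter the file path: ")
    with open(path, errors="ignore") as f:
        return f.read()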
<b>The IP parser is executed. It parses all the IPs from the web or file input; the result of this first pass is stored in ip_all.txt.
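The post doesn't show the parser itself; a minimal sketch using a regular expression for dotted-quad candidates, writing one IP per line to ip_all.txt (parse_ips is a hypothetical name):

import re

# loose dotted-quad pattern; stricter validation happens in the second pass
IP_PATTERN = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")

def parse_ips(text, out_path="ip_all.txt"):
    # scan the collected text and dump every candidate IP, one per line
    candidates = IP_PATTERN.findall(text)
    with open(out_path, "w") as f:
        for ip in candidates:
            f.write(ip + "\n")
    return candidates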
<c>With all the IPs parsed into a single file, it's time for the second check. This step removes all duplicate IPs and performs a second round of IP validation; the results are written to ip_all2.txt.
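One way to do the deduplication and second validation pass, sketched with Python's ipaddress module (the function name dedupe_and_validate is hypothetical):

import ipaddress

def dedupe_and_validate(in_path="ip_all.txt", out_path="ip_all2.txt"):
    seen = set()
    valid = []
    with open(in_path) as f:
        for line in f:
            ip = line.strip()
            if not ip or ip in seen:
                continue
            seen.add(ip)
            try:
                # raises ValueError for anything that is not a real IP address
                ipaddress.ip_address(ip)
            except ValueError:
                continue
            valid.append(ip)
    with open(out_path, "w") as f:
        f.write("\n".join(valid) + "\n")
    return valid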
<d>With all the IPs collected in ip_all2.txt, it's time for the evaluation. whois is executed for each IP, and the results are generated in JSON format. This can take a while depending on the volume of IPs being evaluated.
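The post doesn't say how whois is invoked; a sketch using the third-party ipwhois library, assuming an RDAP lookup per IP with everything dumped to result_ini.json keyed by IP (one possible approach, not necessarily the one used):

import json
from ipwhois import IPWhois  # third-party: pip install ipwhois

def whois_all(in_path="ip_all2.txt", out_path="result_ini.json"):
    results = {}
    with open(in_path) as f:
        ips = [line.strip() for line in f if line.strip()]
    for ip in ips:
        try:
            # RDAP lookup returns a plain dict that serialises cleanly to JSON
            results[ip] = IPWhois(ip).lookup_rdap(depth=1)
        except Exception as exc:
            results[ip] = {"error": str(exc)}
    with open(out_path, "w") as f:
        json.dump(results, f, indent=2)
    return results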
<e>Once the whois-to-JSON step is done, the JSON data (result_ini.json) is converted with csv.writer into ip_whoisF.csv, so the data ends up in that CSV file.
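A sketch of the JSON-to-CSV step with csv.writer, assuming the result_ini.json layout from the previous sketch; the columns chosen here (asn, country, network name) are illustrative, not necessarily the ones in the real table:

import csv
import json

def json_to_csv(in_path="result_ini.json", out_path="ip_whoisF.csv"):
    with open(in_path) as f:
        results = json.load(f)
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        # illustrative header; the real tool may export different columns
        writer.writerow(["ip", "asn", "country", "network_name"])
        for ip, data in results.items():
            network = data.get("network") or {}
            writer.writerow([
                ip,
                data.get("asn", ""),
                data.get("asn_country_code", ""),
                network.get("name", ""),
            ])
    return out_path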
<f>The process then halts and returns to the main menu.
The Walkthrough
<still ongoing...>