Commands using whois (16)

  • Returns nothing if the domain exists and 'No match for domain.com' otherwise.


    7
    whois domainnametocheck.com | grep match
    Timothee · 2009-08-11 13:33:25 17
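
    For use in scripts, a minimal sketch building on the same idea (assuming the registry answers with a 'No match' line for unregistered domains):

    whois domainnametocheck.com | grep -q 'No match' && echo "available" || echo "registered"
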
  • A quick function to check if a domain is already registered or if it's available for purchase.


    5
    function canibuy { whois "$1" 2>/dev/null | grep -q 'Registrant' && echo "taken" || echo "available"; }
    tyzbit · 2023-06-21 15:01:25 181
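
    Example usage of the function above (hypothetical domain names; assumes the TLD's whois output prints a 'Registrant' line for registered domains):

    canibuy example.com                          # prints "taken"
    canibuy some-unregistered-name-12345.com     # prints "available"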

  • 4
    ASN=32934; for s in $(whois -H -h riswhois.ripe.net -- -F -K -i $ASN | grep -v "^$" | grep -v "^%" | awk '{ print $2 }' ); do echo " blocking $s"; sudo iptables -A INPUT -s $s -j REJECT &> /dev/null || sudo ip6tables -A INPUT -s $s -j REJECT; done
    koppi · 2016-04-08 11:30:12 61
  • Create a text file called domainlist.txt with one domain per line, then run the command. All registries are a little different, so play around with the command. It should produce a list of domains and their expiration dates. I am responsible for my company's domains and have a dozen or so myself, so this is a quick check to see if I have overlooked any.


    3
    cat domainlist.txt | while read line; do echo -ne $line; whois $line | grep Expiration ; done | sed 's:Expiration Date::'
    netsaint · 2010-05-02 06:49:09 6
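
    Registries label the field differently (e.g. 'Registry Expiry Date'), so a case-insensitive match is a more forgiving sketch:

    while read -r d; do echo -n "$d "; whois "$d" | grep -iE 'expir(y|ation) date' | head -n1; done < domainlist.txt
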
  • Useful if you, for instance, want to block or allow all connections from a certain provider which uses successive netnames for its IP blocks. In this example I used the German Deutsche Telekom, whose dial-in pools use the netname DTAG-DIAL followed by a number. There are - as always ;) - different ways to do this. If you have seq available you can use

    net=DTAG-DIAL ; for i in `seq 1 30`; do whois -h whois.ripe.net $net$i | grep '^inetnum:' | sed "s;^.*:;$net$i;" ; done

    or, without seq, bash brace expansion:

    net=DTAG-DIAL ; for i in {1..30}; do whois -h whois.ripe.net $net$i | grep '^inetnum:' | sed "s;^.*:;$net$i;" ; done

    or, if you like while better than for, something like:

    net=DTAG-DIAL ; i=1 ; while true ; do whois -h whois.ripe.net $net$i | grep '^inetnum:' | sed "s;^.*:;$net$i;" ; test $i = 30 && break ; i=$(expr $i + 1) ; done

    and so on.


    2
    net=DTAG-DIAL ; for (( i=1; i<30; i++ )); do whois -h whois.ripe.net $net$i | grep '^inetnum:' | sed "s;^.*:;$net$i;" ; done
    drizzt · 2009-08-01 05:28:19 4
  • Change the $domain variable to whichever domain you wish to query. Works with the majority of whois info; for those where it doesn't, you may have to compromise:

    domain=google.com; for a in $(whois $domain | grep "Domain servers in listed order:" --after 3 | grep -v "Domain servers in listed order:"); do echo ">>> Nameservers for $domain from $a <<<"; dig @$a $domain ns +short; echo; done

    Note that this doesn't work as well as the first one; if they have more than 3 nameservers, it won't hit them all. As the summary states, this can be useful for making sure the whois nameservers for a domain match the nameserver records (NS records) from the nameservers themselves.


    2
    domain=google.com; for ns in $(whois $domain | awk -F: '/Name Server/{print $2}'); do echo ">>> Nameservers for $domain from $ns <<<"; dig @$ns $domain ns +short; echo; done;
    laebshade · 2011-05-08 04:46:34 4
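
    To actually compare the whois name servers against the zone's NS records, a rough one-liner sketch (assumes the registry prints 'Name Server:' lines and that the names differ only in case and trailing dots):

    domain=google.com; diff <(whois $domain | tr -d '\r' | awk -F': *' '/Name Server/{print tolower($2)}' | sort -u) <(dig +short NS $domain | sed 's/\.$//' | sort -u)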

  • 1
    whois commandlinefu.com | grep -E '^\s{3}'
    ca9lar · 2019-04-09 21:09:30 76

  • 0
    while read line; do pais=$(whois "$line" | grep -E '[Cc]ountry'); echo -n "IP=$line Pais=$pais" && echo; done <listaip
    pathcl · 2010-10-25 15:39:50 32
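
    whois often prints more than one country line; a variant sketch that keeps just the first value (assumes listaip holds one IP per line):

    while read -r ip; do c=$(whois "$ip" | grep -iE '^country:' | head -n1 | awk '{print $2}'); echo "IP=$ip Pais=$c"; done < listaip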
  • Nice neat feedback showing contact information for as many domains as you wish to feed it. I used a list of domains, each one on a new line as supplied by our registrar; we needed to check they were all up to date and back them up as we are updating them all.


    0
    whois -H $(cat ./list_of_domains) | awk 'BEGIN{RS=""}/Registrant/,/Registration Service Provider:/ {print} END{print "----------------\n"}'
    djsmiley2k · 2011-01-11 12:55:34 3
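
    Depending on the whois client, handing it every domain at once may be sent as a single combined query; a per-domain loop is the safer sketch (same awk range as above):

    while read -r d; do whois -H "$d" | awk '/Registrant/,/Registration Service Provider:/'; echo "----------------"; done < ./list_of_domains
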
  • Found on https://bitcointalk.org/index.php?topic=55520.0


    0
    for i in `wget -qO- url|grep '<a rel="nofollow"'|grep http|sed 's|.*<a rel="nofollow" class="[^"]\+" href="[^"]*https\?://\([^/]\+\)[^"]*">[^<]\+</a>.*|\1|'`;do if test -n "$(whois $i|grep -i godaddy)";then echo $i uses GoDaddy;fi;sleep 20;done
    coinbitsdotcom · 2011-12-24 19:12:18 5

  • 0
    cat domainlist.txt | while read line; do echo -ne $line; whois $line | grep Expiration ; done | sed 's:Expiration Date::'
    jun3337 · 2013-05-13 02:55:17 8
  • Retrieves AS route prefixes for IPv4 and IPv6, aggregates the routes to the minimal set, and adds netfilter rules to reject them. Relies on two helpers: IPv4 - "aggregate" by Joe Abley (package name 'aggregate'), IPv6 - "aggregate6" by Job Snijders ( https://github.com/job/aggregate6 )


    0
    ASN=32934; for IP in 4 6; do whois -h riswhois.ripe.net \!${IP/4/g}as${ASN} | sed -n '2 p' | tr \ \\n | aggregate${IP/4/} | while read NET; do ip${IP/4/}tables -I INPUT -s ${NET} -j REJECT; done; done
    iam_TJ · 2016-05-29 09:45:34 10
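
    The parameter expansions in that one-liner are dense; the same approach spelled out as a script sketch (assumes 'aggregate' and 'aggregate6' are installed and you have root; in an interactive shell the '!' may need escaping as in the one-liner):

    ASN=32934
    for IP in 4 6; do
      # "!gas<ASN>" asks riswhois for the IPv4 routes of the AS, "!6as<ASN>" for the IPv6 routes
      whois -h riswhois.ripe.net "!${IP/4/g}as${ASN}" |
        sed -n '2 p' | tr ' ' '\n' |
        aggregate${IP/4/} |
        while read NET; do
          ip${IP/4/}tables -I INPUT -s "$NET" -j REJECT    # iptables for v4, ip6tables for v6
        done
    done
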
  • I don't know why you would want to echo "blocking ....", but my alternative is functionally equivalent with the extra echo.


    0
    ASN=32934;whois -H -h riswhois.ripe.net -- -F -K -i $ASN|awk '/^'$ASN'/ {if ($2 ~ /::/) {a="6"} else {a=""};b="sudo ip"a"tables -A INPUT -s "$2" -j REJECT"; print " blocking "$2;system(b)}'
    AndrewM · 2016-07-22 07:48:27 14
  • Outputs whois results for multiple domains listed in a plain text file.


    -2
    for domain in `cat list_of_domains.txt`; do echo $domain; whois $domain >> output.txt; done
    pathcl · 2010-02-15 17:13:45 7
  • This can be used in scripts to find out the origin of a target IP, etc.


    -5
    x=192.168.1.1; whois $x > $x.txt
    sxiii · 2011-01-17 03:33:49 9
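
    To pull the interesting fields back out of the saved file, a small follow-up sketch (field names vary between RIRs):

    x=192.168.1.1; whois $x > $x.txt; grep -iE '^(netname|country|origin|org-name)' $x.txt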
  • It would be nice if commandlinefu.com had a better domain name. Will they pick one of the above? We'll see.


    -11
    whois cmd.fu;whois cmdfu.com|grep -i cmdfu
    axelabs · 2009-02-19 08:57:50 8
