I wanted to download some course material on RL that the author had shared via Google Drive, using the command line. I grabbed most of it with wget, but fetching a whole folder from Google Drive was a challenge. Searching Stack Overflow gave me a hint but no complete solution. In the end I installed gdown via pip and ran:

gdown --folder --continue https://drive.google.com/drive/folders/1V9jAShWpccLvByv5S1DuOzo6GVvzd4LV

Note that gdown caps folder downloads at 50 files; if the folder holds more, you need to pass --remaining-ok, and even then only the first 50 are fetched. In that case it's best to download the folder via the web UI and decompress it locally. Decompressing from the command line produced Unicode-related errors, but using the macOS UI it decompressed without a glitch.
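Spelled out as commands (the folder URL is the one from above; the ID-extraction line at the end is just a convenience I use, not part of gdown):

```shell
# Install gdown, then fetch the whole folder, resuming partial downloads:
#   pip install gdown
#   gdown --folder --continue "https://drive.google.com/drive/folders/1V9jAShWpccLvByv5S1DuOzo6GVvzd4LV"
# For folders with more than 50 files, gdown refuses unless you accept
# getting only the first 50:
#   gdown --folder --remaining-ok "https://drive.google.com/drive/folders/1V9jAShWpccLvByv5S1DuOzo6GVvzd4LV"

# Handy to know: the folder ID is just the last path segment of the URL.
url="https://drive.google.com/drive/folders/1V9jAShWpccLvByv5S1DuOzo6GVvzd4LV"
echo "${url##*/}"
# 1V9jAShWpccLvByv5S1DuOzo6GVvzd4LV
```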

### How to search your YouTube history from the command line

So I eventually found that the personal YouTube search UI lives at https://myactivity.google.com/myactivity?restrict=ytw

A nice UI, no doubt, but I need to get at it from the command line...

To avoid breaking the URI, we must ensure the query text is URL-encoded, so:

```bash
urlencode() {
  # urlencode <string>
  local length="${#1}"
  for (( i = 0; i < length; i++ )); do
    local c="${1:i:1}"
    case $c in
      [a-zA-Z0-9.~_-]) printf '%s' "$c" ;;
      *) printf '%%%02X' "'$c" ;;
    esac
  done
}
```

With that, the history lookup becomes a small wrapper (the query is passed through urlencode rather than piped, since urlencode takes its string as an argument):

```bash
# youtube history command
yth() {
  google-chrome "https://myactivity.google.com/myactivity?q=$(urlencode "$*")&restrict=ytw"
}
```

P.S. All of these go in a dotfile, say at ~/Dotfiles/.functions, and are then sourced via:

$ source ~/Dotfiles/.functions
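For a quick sanity check, here is the encoder as a self-contained snippet that echoes the final URL instead of launching a browser (same function as above):

```shell
urlencode() {
  # urlencode <string>: percent-encode everything outside the unreserved set
  local length="${#1}"
  for (( i = 0; i < length; i++ )); do
    local c="${1:i:1}"
    case $c in
      [a-zA-Z0-9.~_-]) printf '%s' "$c" ;;
      *) printf '%%%02X' "'$c" ;;
    esac
  done
}

urlencode "Willy Wonka"; echo
# Willy%20Wonka
echo "https://myactivity.google.com/myactivity?q=$(urlencode "Willy Wonka")&restrict=ytw"
```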

So, to look up the legendary session "Willy Wonka of Containers - Jessie Frazelle",


I need only type:

$ yth Willy Wonka

And I instantly achieve container nirvana at #ContainerCamp.