Downloading folders from Google Drive

I wanted to download some course material on RL that the author had shared via Google Drive, using the command line. I got most of it with wget, but a folder on Google Drive was a challenge. I looked it up on Stack Overflow, which gave me a hint but no solution. I installed gdown using pip and then ran: gdown --folder --continue https://drive.google.com/drive/folders/1V9jAShWpccLvByv5S1DuOzo6GVvzd4LV If the folder contains more than 50 files you need to pass --remaining-ok, and even then you only get the first 50. In that case it's best to download the folder through the UI and decompress it locally. Decompressing from the command line produced Unicode-related errors, but using the Mac UI I decompressed it without a glitch.

The Happy Winners: #nlihack 2017

About the event 

The happy winners!
Amir Aharoni, Oren Bochman and Chaim Cohen
Last week (November 23-24, 2017) I had the pleasure of participating in the National Library of Israel's first hackathon. I've been to the NLI a few times with friends from the Wikimedia movement to instruct its staff and students in editing Wikipedia. But at the hackathon, the NLI opened its doors to the best and brightest minds to help with tagging and disseminating its extensive image database.

The Team

You can't win a hackathon without a great team. My team consisted of seven developers who have been part of the core community of Wikimedia developers in Israel and have been meeting irregularly since the international Wikimedia hackathon organized by Wikimedia Israel last year. We had met about a week before the event at the local chapter's offices and discussed over pizza what we wanted to do and what the NLI had asked us to do. I realized that the most-requested task would need a longer-term commitment, and possibly a discussion with NLI staff about a suitable upgrade of the Commons mass-upload tools.

P.S. I'll be adding a note on the team members ASAP.

Oren Bochman presenting the deep learning model

The Ideas 


One suggestion had been to learn categories from the textual descriptions in the image metadata. I had a related yet more ambitious idea: to retrain state-of-the-art models such as Inception and FaceNet (itself based on Inception) to learn the contents of the images in the database. I thought this was viable since it built on deep-learning work with TensorFlow and its higher-level interface Keras, with which I, as well as other team members, had considerable prior experience. The only caveat was that training would require considerably more computing power than my laptop could provide.
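The retraining idea amounts to standard transfer learning: freeze a pre-trained Inception backbone and train only a new classification head. A sketch, assuming TensorFlow's bundled Keras API and a hypothetical set of five archive categories (not the categories we actually used):

```python
# Transfer-learning sketch: retrain InceptionV3 for archive image categories.
# The number and names of categories here are hypothetical.
import tensorflow as tf

NUM_CLASSES = 5  # e.g. portraits, buildings, manuscripts, maps, posters

def build_classifier(num_classes=NUM_CLASSES, weights="imagenet"):
    # Load InceptionV3 without its ImageNet classification head;
    # global average pooling turns the feature maps into one vector.
    base = tf.keras.applications.InceptionV3(
        include_top=False, weights=weights, pooling="avg")
    base.trainable = False  # freeze the convolutional backbone

    # Only this new head is trained on the library's images.
    model = tf.keras.Sequential([
        base,
        tf.keras.layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```

Training the head alone is cheap enough for a laptop; fine-tuning the backbone afterwards is what needs serious compute.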

The second idea, due to the esteemed Amir Aharoni, was to split up manuscripts using a heuristic and then use a chatbot to crowd-source their transcription into text. Half the team took on this challenge, which complemented my idea, since any ML task benefits from crowd-sourcing a suitable dataset!

The Story

The incredible Chaim Cohen presenting the overall concept
The actual hackathon started out as a bust. We had practically no internet access for a number of hours. After about six hours and a number of complaints we were relocated, but there were additional setbacks. The scripts I had located for downloading categories from Wikimedia Commons using the Pywikibot framework proved incompatible with the more recent version of the framework and needed to be rewritten from scratch. Interfacing with the NLI's database search and high-quality images also proved trickier than we had initially expected. Last but not least, we had no idea which categories of images to train on. It turned out that pre-trained Inception was no good with people, and FaceNet needs alignment using a face detector such as OpenCV's or its own model.

Finally, chatting with the other team members showed that they had stalled too and could not help with our challenges. At this point I was ready to give up and go to sleep, but Chaim was inspired, and after talking to a librarian on the night shift we decided to give everything a second chance: both the data collection and improving the models. I rebuilt TensorFlow from source, retrained the models, and we started seeing much better outcomes as we got more images to work with.

The Presentation

You can have a perfect project, but the presentation is what wins the hackathon. I took a nap, and when I got up there were three hours on the clock, so we got to work on the presentation, where again Mr. Cohen played a vital role. He told the story, created a logo, and infused the whole crowd with his enthusiasm for our idea. But to my surprise, Chaim insisted that two more team members stand on the stage and present, and we did!

Our presentation
