My name is Alex Shelkovnykov. I was born in Kiev, Ukraine.
I grew up in a lot of places: Toronto, Halifax, Chicago. I grew up in the suburbs. I grew up online.
In high school, I thought I would work for NASA. In college, I thought I would write graphics software. Instead, I worked on Machine Learning, then virtual OS interpreters, and then networking.
I've been to 27 countries on 6 continents, and played hockey in 5 of them. I speak Russian fluently and French well, but I can also speak some Spanish, Japanese, and Thai.
I'm a recreational athlete who regularly takes part in a variety of sports: Weight Training, Ice Hockey, Swimming, Fencing, Muay Thai, Sport Shooting, and Badminton.
- worked short-term contracts, pursued personal projects, and tried to launch two startups
- contributed to open-source projects
- tutored coding bootcamp students and aspiring developers
- participated in hackathons and coding challenges
- refreshed and deepened expertise in various fields:
- networking
- cryptography
- blockchain technology
- worked on the development of a virtual operating system and the interpreters used to run it (a sketch of the core event-loop idea follows this role's bullets)
- purely functional ("a deterministic operating function")
- non-preemptive
- kernel events exist as a set of ordered ACID transactions
- system state stored in a single-level store
- native networking with guaranteed exactly-once message delivery
- written in C, Rust, and Nock / Hoon
- specific duties:
- wrote a new Noun allocator, Nock interpreter, and persistent memory arena
- added native JSON (de)serialization support
- added Docker support and modified the build flow to publish images to Docker Hub
- wrote helper tools for externally injecting events and threaded tasks directly into the kernel
- wrote usage guides and improved documentation
- supported new users during on-boarding
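Below is a minimal sketch (in Rust, one of the languages listed above) of the "deterministic operating function" idea described in this role: the kernel is modeled as a pure function from (state, event) to (new state, effects), so replaying the same ordered event log from the same initial state always reproduces the same kernel state. The types and names used here (`Event`, `Effect`, `State`, `step`) are illustrative assumptions, not taken from the actual project.

```rust
// Minimal sketch of a "deterministic operating function": the kernel is a pure
// function from (state, event) to (new state, effects). All names here are
// illustrative, not taken from any real codebase.

#[derive(Debug, Clone)]
enum Event {
    Poke(String), // an external input injected into the kernel
    Timer(u64),   // a timer firing at a given tick
}

#[derive(Debug)]
enum Effect {
    Emit(String), // an output for the runtime to deliver
}

#[derive(Debug, Default, Clone)]
struct State {
    tick: u64,
    log: Vec<String>,
}

// The "operating function": no I/O, no hidden state. Applying the same
// ordered event list to the same initial state always yields the same
// final state, which is what lets the event log behave like a sequence
// of replayable transactions.
fn step(state: &State, event: &Event) -> (State, Vec<Effect>) {
    let mut next = state.clone();
    next.tick += 1;
    match event {
        Event::Poke(msg) => {
            next.log.push(msg.clone());
            (next, vec![Effect::Emit(format!("ack: {msg}"))])
        }
        Event::Timer(at) => {
            next.log.push(format!("timer@{at}"));
            (next, Vec::new())
        }
    }
}

fn main() {
    // The runtime owns the ordered event log; replaying it from the initial
    // state reconstructs the exact same kernel state.
    let events = vec![
        Event::Poke("hello".into()),
        Event::Timer(42),
        Event::Poke("world".into()),
    ];

    let mut state = State::default();
    for event in &events {
        let (next, effects) = step(&state, event);
        state = next;
        for effect in &effects {
            println!("effect: {effect:?}");
        }
    }
    println!("final state: {state:?}");
}
```

Keeping `step` free of I/O is what lets the runtime treat the ordered event log as the source of truth: persisting the log (plus periodic state snapshots) is enough to reconstruct kernel state exactly, which is the intuition behind the single-level store and transaction-like event handling mentioned above.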
- senior member of the "AI Algorithms Foundation" team
- tech lead for the Photon ML open-source library
- interviewed applicants for junior & senior positions
- junior member of the "Machine Learning Algorithms" team
- developed and supported ML tools built with:
- Apache Spark
- Apache Hadoop
- XGBoost
- intern member of the "Search, Network, and Analytics" team
- worked on libraries for performing machine learning in Apache Hadoop:
- data summarization
- anomaly detection
- evaluation metrics computation
- worked on support for the Alembic file format in Houdini:
- reading/writing NURBS curves and surfaces
- calculating/storing visibility information
- developing a system for reading, modifying, and writing geometry transformation hierarchies
- converting Autodesk Maya camera objects to Houdini camera objects
- intern member of the "Relevance" team
- worked as a "scripting handyman" for the team's data scientists and data engineers:
- developed scripts to process large volumes of data to create training datasets
- developed tools to view, score, and compare results from multiple deployed models