My name is Alex Shelkovnykov. I was born in Kiev, Ukraine.
I grew up in a lot of places: Toronto, Halifax, Chicago. I grew up in the suburbs. I grew up online.
In high school, I thought I would work for NASA. In college, I thought I would write graphics software. Instead, I worked on Machine Learning, then virtual OS interpreters, and then networking.
I've been to 27 countries on 6 continents, and played hockey in 5 of them. I speak 3 languages fluently, but I learn some of the local language wherever I go.
I'm a recreational athlete who regularly participates in various types of sport: conditioning (Weight Training, Swimming), competitive (Hockey, Badminton), and combat (Fencing, Muay Thai, Precision Shooting).
- Programming languages:
  - C, C++
  - Golang
  - Rust
  - Java, Kotlin, Scala, Groovy
  - Shell scripting
- Tools & frameworks:
  - Apache Spark
  - Apache Hadoop
  - Git
- Operating systems:
  - Linux, macOS
- Languages:
  - English
  - Russian
  - French
- worked on short-term contracts and personal projects
- contributed to open-source projects
- tutored aspiring developers and coding bootcamp participants
- participated in hackathons and coding challenges
- refreshed and deepened expertise in various fields:
  - networking
  - cryptography
  - blockchain technology
- worked on the development of a virtual operating system and the interpreters used to run it
  - purely functional ("a deterministic operating function")
  - non-preemptive
  - kernel events exist as a set of ordered ACID transactions
  - system state stored in a single-level store
  - native networking with guaranteed exactly-once message delivery
  - written in C, Rust, and Nock/Hoon
  - specific duties:
    - wrote a new Noun allocator, Nock interpreter, and persistent memory arena (a simplified sketch of the noun model follows this entry)
    - added native JSON (de)serialization support
    - added Docker support and modified the build flow to publish images to Docker Hub
    - wrote helper tools for externally injecting events and threaded tasks directly into the kernel
    - wrote usage guides and improved documentation
    - supported new users during onboarding
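For context, here is a minimal sketch in C of the noun data model that such an allocator manages: a noun is either an atom (a natural number) or a cell (an ordered pair of nouns). This is only an illustration under simplifying assumptions (fixed-width atoms, plain heap allocation); the type and function names are hypothetical, not the project's actual API.

```c
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

/* A noun is either an atom (a natural number) or a cell (an ordered
 * pair of nouns). Real atoms are arbitrary-precision; a 64-bit field
 * is used here only to keep the sketch short. */
typedef struct noun noun;
struct noun {
    int is_cell;                          /* 0 = atom, 1 = cell */
    union {
        uint64_t atom;                    /* atom value */
        struct { noun *head, *tail; } cell;
    } as;
};

/* Allocate an atom. A real allocator would carve nouns out of a
 * persistent memory arena instead of the C heap. */
static noun *atom_new(uint64_t value) {
    noun *n = malloc(sizeof *n);
    n->is_cell = 0;
    n->as.atom = value;
    return n;
}

/* Allocate a cell from two existing nouns. */
static noun *cell_new(noun *head, noun *tail) {
    noun *n = malloc(sizeof *n);
    n->is_cell = 1;
    n->as.cell.head = head;
    n->as.cell.tail = tail;
    return n;
}

int main(void) {
    /* Build the noun [1 2] and print its two atoms. */
    noun *pair = cell_new(atom_new(1), atom_new(2));
    printf("[%llu %llu]\n",
           (unsigned long long)pair->as.cell.head->as.atom,
           (unsigned long long)pair->as.cell.tail->as.atom);
    return 0;
}
```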
- senior member of the "AI Algorithms Foundation" team
  - tech lead for the Photon ML open-source library
  - interviewed applicants for junior & senior positions
- junior member of the "Machine Learning Algorithms" team
  - developed and supported ML tools using:
    - Apache Spark
    - Apache Hadoop
    - XGBoost
- intern member of the "Search, Network, and Analytics" team
- worked on libraries for performing machine learning in Apache Hadoop:
- data summarization
- anomaly detection
- evaluation metrics computation
- worked on support for the Alembic file format in Houdini:
  - reading/writing NURBS curves and surfaces
  - calculating/storing visibility information
  - developing a system for reading, modifying, and writing geometry transformation hierarchies
  - converting Autodesk Maya camera objects to Houdini camera objects
- intern member of the "Relevance" team
- worked as a "scripting handyman" for the team data scientists and data engineers:
- developed scripts to process large volumes of data to create training datasets
- developed tools to view, score, and compare results from multiple deployed models