Current Projects:

  1. A global grant program parser that translates scientific language into business-friendly terms, making complex research more accessible to investors.
  2. A data-driven political CRM system designed to analyze voter data and enhance voter engagement strategies.
  3. A pet project: a voice recorder paired with an LLM, for personal use.

Behind the Code: A Personal Journey

First Encounter

I remember that winter day in 1993 as if it were yesterday. My dad brought home several beautiful boxes and started connecting them with wires. It was a mesmerizing sight. This machine, brought into our new yet still Soviet-style apartment, looked like it had been created by an alien civilization. It was a turning point in my childhood, and probably my entire life. The machine was a 386 PC with a 40 MHz processor and a 120 MB hard drive, a marvel of technology that ran on DOS and, soon after, on Windows 3.11.

Naturally, the first thing that caught my interest was the games: Prince of Persia, Loom… But the problem was that there was never enough space on the hard drive. My dad only allowed me to transfer games from floppy disks to the hard drive on rare occasions.

By default, however, Turbo Basic, QBasic, and Pascal were always installed on the computer, with books on these programming languages lying on the table. Since I didn’t really care what I was doing as long as I was sitting at the computer, I started entertaining myself by exploring programming languages.

Essentially, I just copied code listings, ran them, and then looked for things to tweak. And, of course, there was the legendary Gorillas, written in QBasic. First, it was fun to play. Second, its source code was an endless supply of amazing “transformers” for my experiments. I learned how to play music through the PC speaker, animate pseudo-graphics, and use colors, all thanks to that game.

This was how I spent my childhood. School distracted me from this passion for a while. Or rather, I got into sports because it seemed like the only way to avoid going to school. For some time, computers took a backseat.

Gorillas 1991. QBasic.

Pseudo-Disappointment

In my early years at university, we had programming courses that felt completely irrelevant, boring, and useless. I became far more fascinated by algebra and physics than by programming.

By my third year at university, I decided to dedicate myself fully to theoretical physics. In practice, with my characteristic radicalism, that meant relying entirely on analytical calculations and completely abandoning numerical methods. I even defended my thesis after carrying out every calculation by hand on paper, without any software. Even now, the experience and habit of pushing analytical expressions as far as possible remain valuable skills for a theoretical physicist.

However, over time, I realized how shortsighted it was to resist technological progress. Today, I understand that the ability to choose the right numerical algorithm for efficiently obtaining results is an essential skill for any theoretical physicist.

Renaissance

For a long time, the only programming I did was numerical work, mostly in C/C++ with scientific libraries such as GSL, ROOT, and LAPACK.

However, everything shifted dramatically in 2017. A close friend asked me to tutor his son in STEM; he had just entered Pace University to study IT and was looking for a mentor. We started with algorithms but quickly moved on to developing Android apps, building web applications in JavaScript, and solving graph theory problems.

At that moment, I realized just how dramatically the ways to “play with code” had expanded since my first encounter with programming.

Some of those projects I retrieved from the archives:

  • JS Spiral
  • Android game .apk (coming soon)
  • GraphAlgorithms with NetworkX (coming soon)

Today

In the last 30 years, I’ve coded in QBasic, Kotlin, Fortran, Julia, plain C, TypeScript…

Today, I mostly write in Python, though I still prefer pure C for scientific work—often wrapped in Python for visualization.
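
To illustrate that workflow, here is a minimal sketch of the “pure C, wrapped in Python” pattern: the numerics live in a compiled C library, while Python only drives it and draws the plot. The library name libintegrate.so and the trapezoid function are hypothetical stand-ins rather than my actual code, and ctypes is just one of several ways to do the wrapping.

    # Hypothetical C routine, compiled into libintegrate.so:
    #     double trapezoid(const double *y, int n, double dx);
    import ctypes
    import numpy as np
    import matplotlib.pyplot as plt

    lib = ctypes.CDLL("./libintegrate.so")   # hypothetical compiled C library
    lib.trapezoid.restype = ctypes.c_double
    lib.trapezoid.argtypes = [
        np.ctypeslib.ndpointer(dtype=np.float64, flags="C_CONTIGUOUS"),
        ctypes.c_int,
        ctypes.c_double,
    ]

    x = np.linspace(0.0, np.pi, 1001)
    y = np.sin(x)
    area = lib.trapezoid(y, len(y), x[1] - x[0])   # heavy numerics stay in C

    plt.plot(x, y, label=f"integral of sin(x) ~ {area:.4f}")   # Python handles visualization
    plt.legend()
    plt.show()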

At the same time, I strive to stay up-to-date and incorporate new technologies into my work—particularly in deployment and infrastructure.

In venture capital, I’m particularly interested in data-driven decision-making: analyzing data, building hypotheses, and using neural networks to support decisions. I’m also focused on applying and fine-tuning LLMs, especially local models.

In my political research, I use Python, SQL, Mapbox, and JavaScript.

People

The people around me have played a phenomenally important role in my life. Pete Kvitek, a close family friend, was one of them. We met when I was eight. He would visit us whenever he was in Moscow and seemed like an absolute guru and oracle (which, in fact, he was). When he talked about his involvement with FidoNet, working with Microsoft, and meeting Bill Gates, I felt like the luckiest kid on earth.

It was Pete who nurtured my interest not only in the technical side (I still consider him one of the best programmers of our time) but also in business infrastructure. I remember how, by then one of Evernote’s founders, he told stories of the ups and downs of Silicon Valley, the dot-com paradox, and the concept of venture capital.

By the way, I even had a chance to work at Evernote in 2005. To support me as a student, Pete recommended me to their office, where I started working as a tester for the mobile version of Evernote.

What’s interesting is that this was before cloud storage was mainstream and before the iPhone was unveiled. Evernote wasn’t even the green elephant yet; it was still called EverNote (with a capital N). I was testing the synchronization of photos taken on Windows Mobile 2003 with the desktop version of EverNote. Thanks to Pete, I was among the lucky few in the world to experience the magic of cloud sync.

Evernote in 2005. It already had cloud sync.