Quick-fire code style poll:

Q: Tabs vs spaces?
Tabs, faster to delete.

Q: Long function names vs short?
Precise names.

Q: Music while programming?
Sometimes, instrumental.

Q: Preferred editor/IDE/secret weapon?
VS Code, Google.

Q: Neat desk or a creative chaos?
Whatever gets the job done.

Q: Coffee?
Nope, tea. Chamomile today.

It has been six months since Emil joined us, so I wanted to have a sit down with him and do a short recap.

Mind you, Emil is not your typical junior developer. He already had a lot of experience when he joined Syrmia. He is full of interesting stories, which I enjoy listening to, be it over a cup of coffee or tea.

Q: You went to Mathematical Grammar School, one of the best schools in Serbia, if not the best?
My focus was on physics at that time, while programming was mostly a hobby.

While there, I participated in a Science Olympiad. We went to Abuja, Nigeria. It was quite memorable for a few reasons. First, it was a dry 40°C in December! It was also a period when a war had just started, so wherever we went, we were escorted by military personnel. We couldn’t visit much of the city, but we got to hang out with other Olympians from all over the world. I won a silver medal and was the best-ranked member of the Serbian team.

Q: You also participated in RoboMac during your studies. How did that go?
RoboMac is a competition in robotics and artificial intelligence (AI). My colleague and I focused on the AI part. We were in Macedonia for a week, writing an AI player for a 2D arcade game. It was quite intense. The game itself was buggy, so we were fixing it while working on our solution in parallel.

Our bot player was different. As a basis, we used a neural network, then applied genetic algorithms to train it. Simply put, we let our player evolve by playing many games against slightly different versions of itself. Then we would take the winners, generate their offspring, and repeat the process. This way, our player got better and better. We won first place and went home undefeated.
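The evolve-winners-and-repeat loop Emil describes can be sketched in a few lines of Python. This is only an illustration of the general technique, not the team's actual code: a "player" here is just a list of weights (in the real bot, the neural network's weights), and a toy fitness function stands in for the match results the bots earned by playing against their siblings.

```python
import random

def mutate(weights, rate=0.1):
    # Offspring: a copy of the parent with small random tweaks to each weight
    return [w + random.gauss(0, rate) for w in weights]

def evolve(fitness, generations=50, pop_size=20, n_weights=4):
    # Start from a random population of weight vectors ("players")
    population = [[random.uniform(-1, 1) for _ in range(n_weights)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Rank players by fitness (in the contest: results of games played)
        population.sort(key=fitness, reverse=True)
        winners = population[:pop_size // 4]
        # Winners survive unchanged; the rest are their mutated offspring
        population = winners + [mutate(random.choice(winners))
                                for _ in range(pop_size - len(winners))]
    return max(population, key=fitness)

# Toy stand-in fitness: a player scores higher the closer its weights are to 0.5
target = [0.5, 0.5, 0.5, 0.5]
best = evolve(lambda w: -sum((a - b) ** 2 for a, b in zip(w, target)))
```

After a few dozen generations, the best player's weights drift toward the target: selection keeps what works, mutation supplies variation, exactly the "better and better" loop described above.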

Q: Afterwards, things started to speed up. You had an internship at the Microsoft Development Center in Belgrade, then one at Nvidia in Santa Clara, and managed to finish your bachelor’s thesis - all in the course of nine months!
At Microsoft we worked on a language processing tool for one of the Microsoft Office programs. We created a working prototype but did not have enough user data to make it into a product then.

Nvidia was completely different. Arriving in California was a culture shock. People there are mostly focused on work, which is different from Europe. They are very professional and devoted, and the atmosphere is very corporate, but I got the feeling that we always talked about work, even at social gatherings, and I didn’t really like that.

America in general was full of contrasts for me. I’m not sure how to describe it. You have a wide spectrum of different people, all in one place. It all works somehow, even though it looks like it shouldn’t.

Q: And then, as soon as you came back from California, you re-packed your bags and the next week started your master’s studies at Cambridge?
I was determined to do my master’s degree abroad. I applied to several top universities in Europe. Cambridge was the one where I got a scholarship, so I went for it. I specialized in machine learning and got an education I was not able to get at home.

Q: How was studying at Cambridge, compared to Belgrade University?
It’s hard to compare bachelor and master studies. Also, Cambridge is an elite university, so it’s probably not a fair comparison.

The organization of courses, and of the studies as a whole, was better there. They are more student-oriented. While our colleges value pure theoretical knowledge more, at Cambridge most exams are completed by studying a certain scientific topic and then doing practical homework in that area. You are given resources and left to roam. Everybody gets the same questions, and nobody even thinks of plagiarizing. In a sense, they expect you are there to gain knowledge, not a title or a piece of paper.

(Photo: coffee break with Emil, Junior Eng., Graphics Virtualization Team)

Q: After Cambridge, you returned to Serbia, and joined us at Syrmia. How did that happen?
One of the terms of my scholarship was that I should come back to Serbia for at least two years. I looked at different companies and chose Syrmia as there was a good overlap between my interests and projects done here – mainly self-driving cars, system software, machine learning… After looking at different teams, I decided on graphics card virtualization.

Q: Can you explain what GPU virtualization is?
Imagine moving graphics processing to the cloud. Instead of buying a very expensive PC, you would pay a fee based on how much graphics resources you use. This would mean you would be able to play AAA games on a low-cost laptop or even a mobile phone.

Q: How does that work?
In simple terms, there is a large number of high-end, specialized GPU cards in a data center somewhere, all shared among users. Each user gets just a part of a graphics card to run their software. This should be cheaper for the user – instead of buying an expensive card, you only pay a fee for the resources used.
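The sharing-and-billing model can be pictured as a toy allocator. This is purely illustrative (the class, the slice granularity, and the fee formula are all assumptions for the sake of the example, not the team's actual design): one physical card is split into slices, users hold some of them, and the fee scales with the share held.

```python
from dataclasses import dataclass, field

@dataclass
class VirtualGPU:
    # One physical card split into equal slices shared among users
    slices_total: int = 8
    allocations: dict = field(default_factory=dict)  # user -> slices held

    def slices_free(self):
        return self.slices_total - sum(self.allocations.values())

    def allocate(self, user, slices):
        # Give a user part of the card, if enough slices remain
        if slices > self.slices_free():
            return False
        self.allocations[user] = self.allocations.get(user, 0) + slices
        return True

    def bill(self, user, fee_per_slice_hour, hours):
        # Pay-per-use: the fee scales with the share of the card held
        return self.allocations.get(user, 0) * fee_per_slice_hour * hours

gpu = VirtualGPU()
gpu.allocate("alice", 2)   # alice holds a quarter of the card
gpu.allocate("bob", 6)     # bob takes the rest; the card is now full
```

A third request would now be refused until someone releases slices, which captures the core idea: many users, one expensive card, each paying only for their part.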

Q: What is your team working on right now?
We are currently designing and developing a UI application that will let people who manage data centers easily work with a large number of GPUs. It is important to figure out all the use cases and understand which features matter to the user. These are all novel technologies, so there aren’t any best practices yet.

Q: Where do you see this technology being applied?
As with GPU development, the gaming industry is currently driving it. One example is Google Stadia – a new gaming platform that will do all graphics processing in the cloud and then stream game content to users. So a player can join a game from any device – phone, PC, or tablet – and the experience should be exactly the same. Graphics should always be at the top level, with all effects turned on.

Q: Apart from gaming, where do you see this technology being applied?
Anywhere graphics computing resources are needed. Some examples are deep learning, scientific computing, and scientific research in general. Time will tell, but it might prove very disruptive and change the way all of us use computers. For example, instead of having a work computer and a home computer, in the future you might have two instances running in the cloud and use any low-power computer in the world to connect to them.