In your esteemed opinion, how important is it for people working on a video game together to be able to use a revision control system to coordinate their work? How many of you have, in your gaming work, been able to get away without using one? Do you think it's less important for artists? How about artists who not only produce art (graphics, sound, etc.) but also write code/scripts?
The answers below are in no particular order. For some I've cleaned up the grammar a little. I've also "anonymized" the people involved because I wasn't sure how they'd like being quoted in public. But rest assured that they are all experienced software developers who've done at least several gaming projects. Let's start with JB:
This is beyond question. Just no question. ... I would never work without a revision control system, even working by myself. Working with a team, running the committed code through automated tests every night (yelling at the person who broke the code is beyond satisfying), staying in sync, knowing that everything everyone is doing will continuously still work together—mandatory. You will encounter crunch-time bugs that can only be found by rolling back through revisions until the bug doesn't exist. And you want to be creative and unlimited? Branching ... so you can try out something radical is priceless! ... You will learn why you need revision control systems eventually, no matter how thick you are. But at the end of the day, you want to work for a good company with excellent people, and they aren't going to want you unless you are on board with revision control systems. End of story. It's a career requirement. So suck it up, buttercup.
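JB's point about rolling back through revisions until the bug disappears deserves a concrete illustration. If your team happens to use Git (JB doesn't name a tool), "git bisect" automates exactly that binary search for the first broken revision. A minimal sketch, where the "last known good" tag v0.3 is made up:

    git bisect start
    git bisect bad                 # the revision you have checked out is broken
    git bisect good v0.3           # some older tag or commit that still worked
    # Git now checks out revisions halfway in between; build, test, and answer
    # with "git bisect good" or "git bisect bad" until the culprit is found.
    git bisect reset               # return to where you started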
Next we have AD with a slightly different opinion:
I would not force students who are resisting. Things like Dropbox provide data sharing, which is most of what 1–2-programmer teams need. If they lose their code you can always make fun of them. Personally I always use revision control, but my games are very code-centric and I came into game development with a software engineering background. One of the things I like about games is that nobody cares about code or tools—only results. Most software engineers are evaluated by other software engineers, and a lot of quasi-religious groupthink comes out of that. Game development is a reality check: Can I really use these software skills to produce something normal people want? Software engineering classes are a better place to force students to learn about revision control; the game development kids should focus on making fun games with minimum friction.
I should say that the student teams in our course are 4–5 students, not 1–2 as AD had assumed. And please note that AD still holds students responsible in case their ad-hoc approach fails. Next we have DC:
For the past few weeks I've been working on a simulator entirely on my own and I'm still kicking myself for not using version control: I've noticed a change in the outputs in the last couple of days that is inexplicable, and I don't have a recent snapshot I can go back to in order to track down what caused it. Certainly I've never worked at a game company, or heard of any acquaintance working at one, that failed to use version control. It's a universal standard. Should it be required of the students? There are arguments in both directions: (a) using it gets them experience with an industry standard; (b) not using it will force them to confront the problems that crop up, and to deeply understand why they want version control in the future. ... I usually side with (a), although I was just reading a part of the JHU orientation which ... emphasizes students learning more on their own.
It's true that we try to encourage "learning on your own" and I guess my courses are particularly "infamous" in that regard. However, I would still hold that it's appropriate to force students to do it; in a company environment they would most likely be "forced" as well. But hey, now I am throwing too much of my own perspective in here. Let's get to AS:
It's a requirement!
That's certainly the most succinct reply I received. So on to SC:
I'd say it's a requirement for the code base at the very least. Git or Perforce can be punishing with Unity files, but it's ultimately best. Art I'm more flexible with. An old job tried to handle art with Git and it was bad. Currently our artists maintain stuff on their own Perforce server (since it's unlikely that more than one artist is working on any given asset) and then upon completion things move to the Unity repository.
The point here is that art assets are hard to merge automatically, so a system that allows "locking" is preferable to one that forces merging. Note, however, that there's still a revision control system in place, albeit a different one. I don't have a way to give students Perforce licenses, but maybe Fossil 2.0 will be an option in the future. Of course Subversion supports locking as well; it just feels slightly antiquated these days.
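For what it's worth, here is a minimal sketch of that locking workflow in Subversion terms (the asset path is made up): the needs-lock property keeps the binary file read-only in everyone's working copy until someone explicitly takes the lock.

    svn propset svn:needs-lock yes Art/hero.psd    # file stays read-only until locked
    svn commit -m "Require a lock before editing hero.psd"
    svn lock Art/hero.psd -m "Reworking the hero sprite"
    # edit the file, then:
    svn commit -m "New hero sprite"                # committing releases the lock

Next we have BR: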
I was able to get away without one... In Nineteen Ninety Freaking Six!
Speaks for itself, doesn't it? On to NMC's opinion:
Sometimes, for a small project with only one person working on it, you can get away with just periodically taking your entire project folder and compressing it into a zip file. This is barely adequate for one person working alone on a recreational project, and even in that situation version control would make your life easier.
I think I am starting to see a pattern here, don't you? Finally we have JC (who refers back to BR above):
Is it essential for every project? No. Is it useful for every project? Yes. A career hallmark of game development (and maybe the tech industry itself) is enjoying learning new things constantly. There is always a better way. Either you love that or you think you've learned enough. The people who think they know enough don't tend to last or have passed into a phase of their career that likely won't last in a satisfying way. Our current view is Git (or Hg) for code and Perforce for content. They just make life easier. For the size of Unity projects you will encounter in college, Git (or Hg) alone is totally fine.
Git is something that everyone hates and doesn't understand if they use it rarely. If they use it regularly, they usually love it. It's crazy fast and lets you do all the things you want to do. Perforce is nice in that it is rock-solid and handles large binaries well. But you also lose information every time you merge (unlike with a distributed system), and branching is painful, particularly after the fact. And, of course, it isn't free ...
I probably agree with BR: let them do whatever but no excuses accepted for lost work. That is the reality. But it's definitely a plus for us as a company when an entry-level candidate has real experience with the major source control systems. For one, it's just a skill like anything else. But, second, it gives some indication that they've done enough work to understand the value.
On a side note: BR "got away with it" until I was hand-merging changes from him and SM every day ... in addition to writing the core game systems. To his credit, BR fully supported the change. And, agh, we used SourceSafe. That I would not recommend.
Seems to me that this (biased?) sample of opinions tends to agree with my feeling that every student should learn how to work with a revision control system. So I, for my part, feel "vindicated," as it were.
If you're a student in one of my project courses and you hate the idea of learning a revision control system, re-read these comments from industry professionals a few times. Then ask yourself whether you'd rather be able to say, truthfully, that you learned something like Git, or whether you'd rather have to apologize that you didn't. Your call, but I am almost certain that your answer will have an effect on your chances of getting hired.
(I'd like to thank everybody who answered my question back when I posted it on Facebook. You're the best!)