From Java to Ruby: Risk

Posted by Bruce Tate on Aug 31, 2006

I wrote From Java to Ruby: Things Every Manager Should Know not for programmers, but for technical decision makers. Ruby advocates have done an excellent job of helping new developers understand the intricacies of Ruby and the flagship Rails framework, but less information is available for managers and executives deciding between technologies. In the last article of this series, I discussed strategies for establishing a pilot project in a Java shop. In this article, I'll examine the changing risk profiles for Java and Ruby.

In the mainstream, "Ruby is risky" is a common perception, and for good reason. New languages are inherently risky. As Ruby on Rails moves closer to the mainstream, that risk will decrease, because you'll have access to a growing set of programmers, components (called gems or plug-ins), books, and business partners. You'll also see the mainstream opinion that "Java is safe". On this point, I strongly disagree. As any language bloats, the risk will often increase. To understand what's happening on this front today, it pays to examine Java's initial adoption.

Technology adoption profiles

Many analysts have described models for technology adoption. One of the most popular was defined in Iowa to describe the adoption of agricultural products, and was later applied to technology in a book called Crossing the Chasm by Geoffrey A. Moore. In it, Moore describes the technology adoption cycle in terms of five distinct groups:

  • Technologists. This group is predisposed to adopt new technologies. Any promising technology can attract this group.
  • Early adopters. This group will adopt new technologies for competitive advantage, regardless of whether they are successful in the mainstream.
  • Pragmatists. This group will adopt technologies once they become mainstream, or have a steep enough growth curve to effectively assure widespread adoption.
  • Conservatives. This group will adopt technologies only after doing so becomes necessary.
  • Skeptics. This group may adopt very late, or may never adopt a given technology.

Moore argues that the key to technology adoption is getting pragmatists on board. Since pragmatists require mass adoption, this middling group wants to see other pragmatists use a technology before it is willing to make a commitment. It's a catch-22: you can't get pragmatists without other pragmatists. For that reason, you'll often see a downward trend in a market acceptance curve after the early adopters are on board, but before the pragmatists arrive. Moore called this downward trend the chasm, and this notion should be at the center of any risk discussion surrounding a new technology.

Moore's solution was to focus on crossing the chasm in stages. Normally, you can't cross the chasm with one big leap. You need to niche-market. Java did so by attacking Internet clients first with applets, and then moving into other niches such as server-side computing, mobile devices, and enterprise architectures.

In Beyond Java, I argue that the chasm for programming languages is especially severe. Most of us recognize that an investment in Lisp may lead to productivity gains, but will also make it more difficult to find programmers, education, libraries, and components. We'll also have to spend more than we'd like on any significant integration. For this reason, the mass market will adopt a major new programming language only every ten years or so. You can easily see this trend in server-side programming languages. Fortran and COBOL emerged in the late 1950s, C in the early 1970s, C++ in the mid 1980s, and Java in 1996. I'd throw C# into the mix as effectively a Java clone, though there's room for some argument. Many other languages emerged over this time, but none received the dominant adoption of those above. Risk is the overriding reason that so many resist new programming language adoption.

Java's risk profile

Java once had to overcome high risk. At the time, most server-side programming was in the C++ programming language. C++ was effectively a systems language, adapted to applications development. The C family succeeded in that space because client-server development--and user-interface development--demanded a combination of performance and flexibility that was not available in many languages of the time. To overcome the risk of adopting a new language, Java needed three conditions to be true:

  • C++ developers had to experience a high level of pain. Pointer arithmetic (combined with the lack of compile-time safety) led to a difficult family of bugs. Manual memory management made leaks commonplace. C++ was simply too difficult for many applications developers. These problems increased the risk profile for C++.
  • Java needed to solve some problems that C++ could not. The Java language provided simplicity, portability, and libraries that C++ couldn't touch. These factors reduced the overall risk profile for Java, keeping teams smaller and radically improving productivity.
  • Java needed a catalyst. With the exploding Internet, applets embedded into Netscape provided a compelling reason for C++ developers to take a look at Java. The C++-like syntax simplified the transition. Java was able to quickly grab a massive community, and a backlash against Microsoft accelerated the transition.

Java's explosion was bigger than anything we'd seen before or since, and is likely larger than anything we'll see again in my lifetime, but the blueprint is clear. To establish a new language, the old language needs to be painful, the new language needs to overcome that pain in a compelling way, and the new language must rapidly accumulate a community through some catalyst.

Java got a foothold quickly as an Internet applications language on the client side. Though the toehold with applets was tenuous, Java quickly moved onto the server side because it offered features that application developers found useful, including:

  • Memory management
  • A cleaner inheritance model
  • Better features for object orientation
  • Portability
  • Internet libraries
  • Security

...and many others. In my opinion, Java is the most successful programming language of all time. Over time, through growth, Java became less risky, and eventually dominated the market for server-side Internet programming. Commercial investment, the pool of programmers, available education, open source frameworks, and many kinds of published information all drive risks down. The reason is intuitive and clear.

Risk associated with a programming language decreases dramatically with market share once the language crosses the chasm.

Java has had an amazingly successful run. But programming languages do not remain the state of the art indefinitely. All successful languages bloat, because they must adapt to the changing needs of their users. Successful programming languages cannot move as quickly as others because they must maintain a certain level of backward compatibility to satisfy a growing user base. As the technology lags and the language bloats, a different kind of risk profile emerges: risks related to market share decrease while risks related to the programmer's ability to get work done effectively increase.

So far, I've focused on the marketplace risks of an emerging technology. As Java reaches its tenth year, another kind of risk assessment becomes necessary. Many influential books, such as The Mythical Man-Month, Death March, and Peopleware, preach about a different kind of risk:

  • Poor productivity leads to larger teams and longer schedules
  • Risk increases with project length
  • Risk increases with the size of a team
  • Quality risks, measured in the number of bugs, increase with the size of a code base
  • Risk increases with cost
  • Integration costs increase with complexity

As a programming language--or even a programming paradigm--ages, the language will often slip in productivity and expressiveness relative to the state of the art. Project teams will need to increase in size, and programmers will need to write more lines of code to solve the same problem. Both of these factors inherently increase risk. All of these factors lead to an inevitable conclusion.

Toward the end of market dominance, productivity risks associated with a language will increase relative to the state of the art.

Whether and how this happens within the Java language is the subject of intense debate. Certainly, Java remains the best language for solving a whole host of enterprise problems, such as very large projects, or those with certain demands such as two-phase commit or hardcore object-relational mapping. Java's commercial investment has never been stronger, and the community is at an all-time high. But cracks in the foundation may be beginning to appear.

Java's Enterprise JavaBeans framework, WS-* style web services, and Java EE have come under increasing criticism for complexity and sagging productivity. James Duncan Davidson, one of the fathers of the servlet, says Java is no longer as approachable as it once was. It's harder to educate a typical Java developer to solve the most common programming problem: the database-backed web application. Circumstantial evidence is emerging that frameworks in other languages, most notably Ruby on Rails, are several times as productive for solving niche problems. High-profile Java developers--James Duncan Davidson, Mike Clark, Justin Gehtland, Stuart Halloway, and many others--have reported very high productivity after using Rails in that important niche: greenfield database-backed web applications. Certainly, my own experience is that I can build, deploy, and maintain such applications with far less effort using Ruby on Rails.

These reports will be broadly debated, just as the early reports of Java's productivity were. Remember, Java emerged first in a variety of niches before it expanded more broadly. Programmer productivity was one of the most important criteria driving Java's early growth. Keep in mind Moore's theory for the emergence of technologies. You'll best cross the chasm not with one giant leap, but one niche at a time.

I strongly believe that complexity and sagging productivity are driving Java's risks up now.

Inherent Ruby risks

Ruby is no different from any other emerging programming language. Lack of commercial investment, a limited pool of developers, and lack of experience all add risk to an emerging language. Here are the biggest risks I've encountered.

  • Lack of talent. It's harder to find existing Ruby developers. As with Java in its early days, that will change quickly, but right now, if you need to build large teams in a short time, you're better off with an established market leader such as Java.
  • Lack of experience. Some LAMP languages have established track records. Google uses Python; many major dot-coms use Perl or C. There's not yet a flagship account for Ruby that shows massive scalability or complex enterprise integration. We just don't know whether it can solve certain classes of problems.
  • Deployment and profiling strategies. Ruby on Rails has been out for less than a year, so deployment and profiling experience isn't nearly as rich as it is for competing languages.
  • Lack of libraries. Ruby does not have nearly as rich a set of libraries as Java.
  • Lack of commercial investment. You have to work harder to find Ruby consulting, education, or contractors, and off-shoring is practically nonexistent.

There are many others. Still, you can effectively mitigate the risks associated with Ruby. Take performance-related risks. Though the body of knowledge around large-scale Ruby deployments is limited, you can learn if you look in the right places. The industry has a wealth of knowledge about other LAMP languages such as PHP, Perl, and Python. The deployment mechanisms, web servers, and shared-nothing strategies for scalability are all similar.

Or consider staffing. Don't underestimate your ability to build an effective staff through internal training. My training schedule for new Java developers covering Spring, Eclipse, Hibernate, and WebWork is effectively five times as long as a similar schedule for a Ruby on Rails developer. You can do well by starting with programmers who know a language with characteristics similar to Ruby, such as Perl, Python, or Smalltalk. If you want to build a programmer from scratch, you'll probably build a productive Ruby developer at least as fast as you can teach a Java developer to use the latest bevy of frameworks.

And think about libraries. How much do you really need? If you need distributed two-phase commit, use Java. If you need perfect integration with Microsoft Office macros, use .NET. But if you're building operating system scripts for integration, or greenfield database-backed applications, Ruby will have just about everything you need. And you can often build what you need if it's not there. I work with one company that built its own database driver in two weeks, and more than made up that time over the rest of the project. I talked to another that extended Oracle support by patching existing code in four hours. ThoughtWorks built RBatis, Ruby's version of iBATIS, in a very short time.

So Ruby's risks are often overstated when you consider the whole picture, especially if Java is not giving you everything you need. The best way to put these risks into perspective is often to try Ruby for yourself. Use Rails to build something nontrivial, and make a call based on what you find. Don't buy into the myths.

Myth versus reality

Rails is a silver bullet.

People have failed with Rails, and many more will fail. If you apply it without the requisite skills, you'll fail too.

On a similar note, if Java's not your problem, Ruby will not be the answer. Most software development problems are not related to technology. If you're thrashing, Ruby on Rails will only help you thrash faster.

Choosing Ruby is too risky, because you could guess wrong.

The primary risk of adopting any new language is that you'll guess wrong and be left with a stagnating set of libraries. That's certainly a significant risk, but the problem is in no way limited to Ruby. Within Java, you need to make potentially dozens of small decisions about major libraries, any of which can leave you with a struggling, stagnating code base. Should you pick Spring or EJB 3 for declarative transactions? Is the Java Persistence API the right choice, or is Hibernate ultimately the answer? What's the right answer for the web MVC layer: a fading Struts, or something cleaner?

Within Ruby, choosing a web development framework is much easier. You'll likely be working with Rails. The dynamic nature of the language also makes it easier to decouple layers of the architecture, making certain decisions much less invasive than their Java counterparts.
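
To illustrate that point, here is a minimal sketch (the class and method names are hypothetical, not from any particular framework). Because Ruby binds methods at run time, a layer only needs collaborators that respond to the right messages; there is no interface to declare, so swapping an implementation touches one line:

    # Any object that responds to #charge can stand in for the payment gateway:
    # the real gateway in production, a cheap fake in tests.
    class FakeGateway
      def charge(amount)
        { :status => :ok, :amount => amount }
      end
    end

    class OrderProcessor
      def initialize(gateway)
        @gateway = gateway   # no PaymentGateway interface required
      end

      def process(order)
        @gateway.charge(order.total)
      end
    end

    # processor = OrderProcessor.new(FakeGateway.new)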

It's always easier to staff a Java project.

Java does have a much larger pool of developers, but the community has significant fragmentation. If you want to use an integrated stack, your choices are limited. Even if you do choose a popular stack such as Spring, your developers must learn potentially dozens of libraries that are specific to a given project. In this case, Java's core strength, a plethora of libraries, works against it. In contrast, most Ruby developers know Rails. Also, you typically need more Java developers to handle a similar task. Sometimes, staffing for Java is easier. Sometimes, it's not.

Rails cannot scale.

Ruby on Rails actually has good scalability. The caching model is strong, and the shared-nothing architecture has proven effective dozens of times over within the LAMP community. In reality, we know that Ruby on Rails can scale to moderately large applications. We don't yet know whether Ruby on Rails can handle very large deployments, but nothing inherent in the architecture leads me to believe that it is a dead end. For typical applications, the latency is in the database anyway.
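
As a hedged illustration of that caching model (the controller and model names here are hypothetical), Rails lets a controller cache whole pages declaratively, so repeat requests are served as static files by the web server without touching Ruby at all:

    class ProductsController < ApplicationController
      # Render once, then let the front-end web server serve the static copy.
      caches_page :index, :show

      def update
        @product = Product.find(params[:id])
        @product.update_attributes(params[:product])
        # Invalidate the cached pages whenever the catalog changes.
        expire_page :action => 'index'
        expire_page :action => 'show', :id => @product
        redirect_to :action => 'show', :id => @product
      end
    end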

Rails integration options are too limited.

Rails has very good support for REST-based web services. Ruby also has emerging support for the JVM through JRuby and, in a separate project, for Microsoft's CLR. Good messaging options are emerging as well. In the end, you'll be in good shape if you pick the best tool for the job. Good teams can succeed with either Java or Ruby.
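
A sketch of what that REST support looks like in practice (the controller is hypothetical, and the exact helpers vary by Rails version): the same action can answer both a browser and a web-service client, which keeps the integration surface small.

    class OrdersController < ApplicationController
      def show
        @order = Order.find(params[:id])
        respond_to do |format|
          format.html                                  # regular page for browsers
          format.xml { render :xml => @order.to_xml }  # same URL doubles as a REST endpoint
        end
      end
    end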

Wrapping up: What actions can you take?

If you're considering using Ruby, there's a wealth of information at your fingertips. Talk to people who have done both Java and Ruby effectively. Read about the frameworks. Check out From Java to Ruby. If you don't think you can leave Java but want a lightweight development experience, check out the Java projects that give you better leverage, such as RIFE, JMatter, or Wicket. If you think Ruby might be a good choice, consider these suggestions:

  • Pick the right tool for the job. Ruby on Rails is not a silver bullet. It's a highly tailored environment for database-backed web applications. It will work much better with new database schemas, or those you can modify to take advantage of Rails defaults (see the sketch after this list).
  • Plan your team ramp-up carefully. You won't be able to throw out an ad on Monster.com and staff the project in three days. You might want to consider training some or all of your developers, and recruiting a few top Rails developers, or taking on some limited consulting help to jump-start things.
  • Know your legacy integration points. Often, the hardest part of a project is defining interactions with external systems. Your initial proof-of-concept work should work through some of these touch points, at least to the point where you're comfortable with your solutions.
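
Here is the sketch referred to above: a minimal, hypothetical example of what "Rails defaults" buy you. When the schema follows the naming conventions, the model layer needs almost no mapping code.

    # Assumes a "products" table with an integer primary key "id" and a
    # "category_id" foreign key pointing at a "categories" table.
    class Product < ActiveRecord::Base
      belongs_to :category     # inferred from the category_id column
    end

    class Category < ActiveRecord::Base
      has_many :products
    end

    # Columns become attributes automatically; no getters, setters, or XML mapping:
    #   cheap = Product.find(:all, :conditions => ["price < ?", 10.00])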

If you're not sure, do a pilot, or go with the conservative option. The best risk mitigation is always good judgement.

About the author

Bruce Tate is a mountain biker, kayaker and father of two in Austin, Texas. He has written nine programming books, including two on Ruby and five on Java. He is the founder of RapidRed, a company with a focus on lightweight development technologies including Ruby and Rails, offering development, consulting, and training. Bruce is recognized worldwide as an excellent speaker, programmer, trainer, and consultant.

Community comments

Correction of the posted article by Baron Davis

My apologies to the readers, but the title of the article was misspelled. It should say:

"How you, script kiddies, can play with building simple tiny CRUD web apps, and enjoying your ignorant life without clustering, legacy system integration, distributed caching, [put your favourite here]..."

It is quite amazing to see article after article that has the same error, but no correction from the original posters whatsoever.

Peace

Re: Correction of the posted article by Alex Popescu

The author clearly speaks about niche markets. Also, we must agree that not all apps need clustering, legacy system integration, etc. Now, I am wondering: what is wrong with having different tools to build different solutions?

./alex
--
:Architect of InfoQ.com:
.w( the_mindstorm )p.

Re: Correction of the posted article by Floyd Marinescu

Baron - the article did not attempt to pitch Rails as a solution for large-scale distributed systems, so it's not fair to criticize it as though it were. It's a fact that Rails is optimal for green field web apps, as evidenced by the Java community attempting to emulate many of these features in its own web frameworks.

If you're not interested in Ruby/Rails content then feel free to turn the Ruby community off (see that little widget on the left sidebar).

Floyd

Re: Correction of the posted article by Alexey Verkhovsky

There are ways to do clustering, legacy system integration and distributed caching with Ruby. And calling Bruce Tate a script kiddy is, well, misguided.

Re: Correction of the posted article by Baron Davis

Also, we must agree that not all apps need clustering, legacy system integration, etc.

I agree 100% with you.

Now, I am wondering: what is wrong with having different tools to build different solutions?

Actually, I am really interested in Ruby and other technologies, because they all reflect the direction in which software evolves. However, it is quite annoying to regularly see comparisons of RoR and Java where Java is usually mentioned in a negative context. I like to read articles of the type "RoR helps you solve the X problem", but dislike regular badmouthing of the type "RoR vs. the entire Java environment".

It's a fact that Rails is optimal for green field web apps...

IMHO, then this should be stated clearly in the article. Otherwise you are deceiving the public (where the public, in this case, is the young minds of inexperienced developers who are not yet able to form their own opinion on this subject).

If you're not interested in Ruby/Rails content then feel free to turn the Ruby community off

As I said previously, I like to read news about Ruby/Rails but dislike this type of article, so it seems I am in a no-win situation.

Nothing personal, guys; the site is great. But considering that you are in a powerful position where you can affect the direction, good or bad, in which young creative developers develop their skills, I felt obliged to speak out.

Peace

Re: Correction of the posted article by Alexey Verkhovsky


It's a fact that Rails is optimal for green field web apps...
IMHO, then this should be stated clearly in the article.


Perhaps you underestimate what can be done with Ruby. The article, on the other hand, seems to state those limitations rather clearly.

Greenfield web apps are where the advantage is most noticeable, but I know people who built a Rails frontend to a legacy Oracle database designed by committee. Reportedly, they saw productivity gains even in that project.

Re: Correction of the posted article by Alex Popescu

[...] however, it is quite annoying to regularly see comparisons of RoR and Java where Java is usually mentioned in a negative context.


Well, I guess most of us Java developers feel quite the same. But it is a well-known marketing strategy to kick the giant of the moment in order to gain more attention. The good news - and this is only my opinion - is that Ruby is one of those things that deserves attention.

./alex
--
:Architect of InfoQ.com:
.w( the_mindstorm )p.

Re: Correction of the posted article by Stefan Tilkov

Baron, it seems to me that your points are valid for many articles -- although hopefully for none of those published on InfoQ. No offense, but did you actually read this particular one? It does not seem to suffer from the script-kiddy-silver-bullet symptom IMO. But maybe I've missed the parts you allude to -- in this case, please point them out.

A language without a VM is sooooo 80s by Faui Gerzigerk

I think Bruce is right to focus on productivity as the factor that makes you want to cross the chasm. But where is that productivity if you have to drop back to C for anything more complex than CRUD? CRUD is not that first niche from which a language can go on to conquer the world. It's one of the few things a language 10 times slower than Java or C++ can ever do because CRUD means delegating all the real work to the DBMS and the operating system.

And no, it's not simply a matter of moving your work to JRuby or some CLR-based Ruby implementation. Ruby's creators started the language at a time when there were no mainstream VMs, and they didn't create one either. That means Ruby (and Python and Perl) is stuck on the antiquated platform that is C. The implementors of JRuby have to port all the C-based stuff to Java, and they will always lag behind. It's a brain-dead duplication of effort. More and more libraries are created for C-based Ruby, and there's no way anyone can keep up with porting that stuff to the JVM. C-based Ruby becomes more entrenched every day, and that means everyone doing anything other than CRUD has to code it in C as well.

Are we really moving our infrastructures back to C just because we want to use some dynamically typed scripting language for the CRUD part? I don't think so. It's not that Ruby is too risky. It's that the Ruby environment as a whole is unproductive because it is based on an antiquated platform model.

Yes it's true, J2EE is unwieldy and bloated. Its development and deployment model runs counter to anything remotely agile and the latter hasn't changed in JEE 5 either. But does that mean we should roll back the real progress that has been achieved with JIT accelerated platform independent virtual machines? I think not. It would be a step back in time.

Joel on Ruby by J Aaron Farr

Funny, today Joel commented on a related subject:

"... and a handful of platforms where The Jury Is Not In, So Why Take The Risk When Your Job Is On The Line? (Ruby on Rails). Before you flame me, Ruby is a beautiful language and I'm sure you can have a lot of fun developing apps it in, and in fact if you want to do something non-mission-critical, I'm sure you'll have a lot of fun, but for Serious Business Stuff you really must recognize that there just isn't a lot of experience in the world building big mission critical web systems in Ruby on Rails, and I'm really not sure that you won't hit scaling problems, or problems interfacing with some old legacy thingamabob, or problems finding programmers who can understand the code, or whatnot..."

www.joelonsoftware.com/items/2006/09/01.html

Wow, Bruce is still going after Java by Marc Stock

He's being more sly about it now but I can't believe that people are still seriously comparing Ruby to Java. People who wanted to whip out quick web apps have always had the option of using languages like PHP and Python. How often do you see posts of this nature regarding Python vs. Java? Almost never.

ruby risk by Rusty Wright

I agree that Bruce glosses over with a very wide brush the advantages of Java, and likewise glosses over the disadvantages of Ruby. I happen to think that Rails is a great framework; for me it's Ruby that's the problem. I can't stop thinking that a language that uses run-time type checking is simply less safe than one that uses compile-time type checking, regardless of whether it's a small team or even just one person, versus a large team working on a large project.

Re: ruby risk by Bruce Tate

It's a risk/reward equation, no? Before automated testing, Java wins. Since automated testing, the reward for static typing is not so great.

I agree that Bruce glosses over with a very wide brush the advantages of Java, and likewise glosses over the disadvantages of Ruby. I happen to think that Rails is a great framework; for me it's Ruby that's the problem. I can't stop thinking that a language that uses run-time type checking is simply less safe than one that uses compile-time type checking, regardless of whether it's a small team or even just one person, versus a large team working on a large project.


Ah...that's a pretty broad brush you're using. One size fits all. So...do you use

- Spring to delay binding until run time
- XML to configure anything, which also allows runtime type checking
- Byte code enhancement and reflection frameworks that use run time type checking, within the context of persistence frameworks?

So you use delayed binding and run time checking all of the time. You just don't get all of the benefits of a language that makes it easy to do the same.

Ruby is not for everyone. I never said it was. But this size definitely fits the apps I've been building lately...like a glove.
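
To make the risk/reward point concrete, here's a minimal, hypothetical sketch: once behavior is pinned down by an automated test, a type mistake surfaces at test time rather than in production, which is most of what the compiler's check was buying you.

    require 'test/unit'

    class Invoice
      def initialize(amount)
        @amount = amount
      end

      def total_with_tax(rate)
        @amount * (1 + rate)
      end
    end

    class InvoiceTest < Test::Unit::TestCase
      def test_total_with_tax
        # Pass a String amount by mistake and this fails immediately,
        # playing the role the compiler's type check plays in Java.
        assert_in_delta 107.0, Invoice.new(100).total_with_tax(0.07), 0.001
      end
    end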

RoR is not JUST for greenfield apps by Steven Talcott Smith

I just completed a total rewrite of a LAMP system in RoR. I re-created a year and a half worth of part time development in a little under 3 months of part time Ruby. (Although the last few weeks going into production were full time.) As part of the re-write, I developed a lot of new features and COMPLETELY REFACTORED MY DATABASE MODEL -- with over 75 tables.

I have never in my entire 15 year career contemplated completely redesigning the database of a running, production application. Add a table here and there but never, ever perform major surgery. This was only possible due to a feature of Rails: Migrations. I am sure there are good tools out there to version your data model but I have come across nothing so sweetly integrated with your code base as migrations. I am one of those "Technologist/Early Adopters." Call me crazy now but I jumped on Java in '96. This is all that and more. And it's just plain fun.
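
To give a flavor of what one of those changes looks like (the table and column names here are made up, not from my real schema), a migration pairs each schema change with its reversal, so the database can move forward and backward with the code:

    class SplitCustomerName < ActiveRecord::Migration
      def self.up
        add_column :customers, :first_name, :string
        add_column :customers, :last_name,  :string
        remove_column :customers, :name
      end

      def self.down
        add_column :customers, :name, :string
        remove_column :customers, :first_name
        remove_column :customers, :last_name
      end
    end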

I knew nothing about Ruby before starting, and I never wrote a demo or greenfield application in it. I figured it would be more instructive to take an application that I actually understood and re-do it. Your mileage may vary.

If you are a big self-starter and your productivity is truly important (i.e., your time is highly valued), consider RoR. If you tend to require formal training, you should probably wait until it is more widely adopted.

I have a nose for things and I suspect there will be good money in RoR consulting for the early birds...

Re: ruby risk by Bill de hÓra

" I can't stop thinking that a language that uses run time type checking is simply less safe than one that uses complile time type checking. "

Not really. The problem is comprehensibility, especially when it comes to mixins. It'll be interesting to see how Rails manages interface creep over the next few years. Systems that allow unfettered interface and function binding can be very hard to understand and work with in the whole (cf Zope 2). In Java, mucking about with classes and interfaces is limited to framework infrastructure. In Rails this is very much exposed to the app developer. That's touted as an advantage today, but it's not free. I expect Bitter Ruby will have a chapter called 100 Method Object.
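
A tiny, made-up example of the kind of growth I mean; nothing at the call site tells you where a given method came from:

    module Auditable
      def audit!
        (@audit_log ||= []) << Time.now
      end
    end

    module Taggable
      def tag(label)
        (@tags ||= []) << label
      end
    end

    class Article
      include Auditable
      include Taggable
      # Article now responds to #audit! and #tag, and any library that reopens
      # Article or mixes in more modules keeps widening that surface.
    end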

Re: ruby risk by Rusty Wright

It's a risk/reward equation, no? Before automated testing, Java wins. Since automated testing, the reward for static typing is not so great.

This assumes that your testing exercises every line of your code and tests all possible permutations of its behaviour. I would guess that that's not very likely.

Ah...that's a pretty broad brush you're using. One size fits all.

So are you saying that having a tool that checks and verifies the type correctness of your code is something that only some people need or want? I'm not talking about any specific language here. Simply that it's safer to have something verify the code's correctness with respect to types.

So...do you use

- Spring to delay binding until run time
- XML to configure anything, which also allows runtime type checking
- Byte code enhancement and reflection frameworks that use run time type checking, within the context of persistence frameworks?

So you use delayed binding and run time checking all of the time. You just don't get all of the benefits of a language that makes it easy to do the same.

You're portraying it as Java vs. Ruby. For me it's the broader question of having something do type checking before I run the program. I want more type checking, and stricter type checking. I'm not saying that Java is the best language; something like Nice (nice.sourceforge.net/) would be better. I want as much help as possible from my tools to find potential and real errors in my programs before I run them.

Re: ruby risk by Rusty Wright

The problem is comprehensibility, especially when it comes to mixins. ... Systems that allow unfettered interface and function binding can be very hard to understand and work with in the whole (cf Zope 2). ... In Rails this is very much exposed to the app developer. That's touted as an advantage today, but it's not free. I expect Bitter Ruby will have a chapter called 100 Method Object.

opal.cabochon.com/~stevey/blog-rants/digging-in...

Re: ruby risk by Bruce Tate

I completely see where you're coming from. I don't think that Rails is perfect. In some ways, it's too accessible. The problem is that we're seeking the same sorts of Rails-like capabilities with Java right now. The complexity comes in a different form, but it still comes. You talk about the 100-method object; what about the 100-class framework to do the same?

Face it: Spring, AOP, heavy reflection, annotations, byte code enhancement, servlets, deployment descriptors, XML configuration... all of these delayed-binding programming techniques come with intellectual baggage of their own.

The question is whether you want to be direct about it, at a limited cost, or indirect, at a far greater cost.


There are better technologies the Ruby on Rails by Ulrich Weber

Sorry, Bruce Tate, but the future of web technology (for us Java developers!) might not be Ruby on Rails. The future might be GWT, Google's Web Toolkit.

RoR is a pure server-side framework, whereas GWT is a complete client-side, server-agnostic web framework that helps us build rich web content in the future. Does RoR help us build rich content, say Ajax RIAs? I don't think a pure server-side framework can provide those facilities. GWT does.

Furthermore, RJS (Ruby JavaScript templates) cannot provide what GWT can. Being a client-side, server-agnostic web framework is a very important point too: you can easily connect all kinds of so-called legacy applications, like Struts, to a GWT frontend. Being server-agnostic also allows us to use existing PHP, Perl, Python, etc. infrastructures such as wikis, CRMs, and forums.

Re: ruby risk by Alexandre Poitras

Great post. Couldn't have said it better.

Re: Wow, Bruce is still going after Java by Robert Dean

He's being more sly about it now but I can't believe that people are still seriously comparing Ruby to Java. People who wanted to whip out quick web apps have always had the option of using languages like PHP and Python. How often do you see posts of this nature regarding Python vs. Java? Almost never.


You took the words out of my mouth. The thing that really turns me off is that Java is singled out for comparison when the usual methodology is to make broad comparisons (to the broad spectrum of general-purpose web platforms: not only Java, but .NET, Python, PHP, Perl, ColdFusion, and others as well). This sort of Burger King/Pepsi (cut down the "market leader" but not the others) type of marketing works for some people, but not all.

The McDonald's/Coke type of marketing (brand awareness) would work better for Ruby. It's certainly worthy of attention and doesn't need to expend entire books comparing itself to Java.

Re: Wow, Bruce is still going after Java by Rusty Wright

The thing that really turns me off is that Java is singled out for comparison when the usual methodology is to make broad comparisons ... This sort of Burger King/Pepsi (cut down the "market leader" but not the others) type of marketing works for some people, but not all.

I think it's an aspect of the strength of Bruce's feelings about these things. Previously he was evangelizing Spring and Java; now it's Ruby and RoR.

Software Needs Philosophers (not religious fanatics).

Re: A language without a VM is sooooo 80s by Charles Nutter

I beg to differ! In fact, we only had to port a few core libraries like YAML support, ZLib, and Sockets to get Rails running under JRuby. In 95% of cases, Ruby libraries are written in pure Ruby. In the few cases where a port is necessary, it's almost always just a matter of wrapping existing functionality in Java. For example, Java has very good ZLib and Socket support, so we mostly just had to wrap those existing features.

JRuby already can run Rails in many scenarios, and we're improving compatibility and performance day by day. Almost all C-based extension writers we've talked to are also very interested in having Java-based versions of their code. A notable supporter is Zed Shaw, creator of Mongrel, who has been very helpful in our efforts to bring Mongrel over to JRuby. Not only will JRuby be able to keep up...we'll be able to innovate in many cases. That means more Ruby for everyone!
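
As a rough sketch of what "wrapping existing functionality" means (the classes below are standard JDK classes, used purely for illustration), JRuby lets Ruby code call straight into Java libraries instead of waiting for a C extension to be ported:

    require 'java'

    # Lean on the JDK's built-in zlib and collections support through
    # JRuby's Java integration layer rather than porting C extensions.
    deflater = java.util.zip.Deflater.new

    list = java.util.ArrayList.new
    list.add("hello from the JVM")
    puts list.get(0)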

Re: Correction of the posted article by Oyku Gencay

Baron,
Your attitude can easily be classified by reading this very article. You should either have read it before posting or articulated your points.

Re: RoR is not JUST for greenfield apps by Cedric Beust

I just completed a total rewrite of a LAMP system in RoR. I re-created a year and a half worth of part time development in a little under 3 months of part time Ruby.

Assume you spent 18 months on a RoR project and then decided to rewrite it in Java: don't you think you would do so in much less time than 18 months?

--
Cedric

Re: ruby risk by Cedric Beust

It's a risk/reward equation, no? Before automated testing, Java wins. Since automated testing, the reward for static typing is not so great.

There is a big difference: the compiler forces you to obey static typing. There is nothing that forces you to test your code.

That is exactly why applications written in dynamically typed languages are more fragile than those written in statically typed languages.

If you can put together a group of superstar programmers, there is no doubt they will do an outstanding job in Ruby on Rails. Or Ruby. Or Java. Or any language, for that matter. That's why they are superstars.

When your team is made of regular programmers, dynamically typed languages are much more of a liability than something like Java.

--
Cedric

Re: ruby risk by Ian Nelson

Do you have any evidence, or can you point to any studies, that show testing closing the reliability gap between weakly typed systems and strongly typed ones? It sounds kind of reasonable, but there is a ton of evidence, going back pretty much as long as software engineering has been around, that shows a substantial advantage to having compile-time checks.


Automated testing is not a terribly new concept, either; it has been going on since the '60s or even earlier.

Re: Wow, Bruce is still going after Java by Alexey Verkhovsky

[Ruby is] certainly worthy of attention and doesn't need to expend entire books comparing itself to Java.


Being a disruptive technology, Ruby does need that, too.

There are people making decisions along the lines of "we are a Java shop now, is it a good idea for us to go Ruby today?". The answer may be "yes", "not today", or "not ever".

As a developer, I want to do Ruby work. As an IT consultant, I don't want to push Ruby where it's not a good choice. And in my world it's practically always Ruby vs. Java vs. C#. So, what Bruce is writing about is quite relevant.

Re: RoR is not JUST for greenfield apps by Steven Talcott Smith

I just completed a total rewrite of a LAMP system in RoR. I re-created a year and a half worth of part time development in a little under 3 months of part time Ruby.

Assume you spent 18 months on a RoR project and then decided to rewrite it in Java: don't you think you would do so in much less time than 18 months?

--
Cedric

Perhaps. A rewrite between comparable languages should take less time because you are not learning the domain model at the same time. However, I shudder to think about the task of rewriting 18 months of RoR in Java. From my brief brush with Python, I would say a rewrite in Python would be easier to contemplate.

The lines-of-code multiple with respect to Java alone would be enough to intimidate me out of that particular path. I prefer to work at as high a level as possible.

It is rare that one gets the opportunity to completely rewrite anything. If I were not also the head of the company, I would probably not have had the go-ahead. Even so, with my businessman hat on, I had my doubts, at least until I got into production.

Steven

Re: RoR is not JUST for greenfield apps by Cedric Beust


Perhaps. A rewrite between comparable languages should take less time because you are not learning the domain model at the same time. However, I shudder to think about the task of rewriting 18 months of RoR in Java.

A lot of this time might have been dedicated to refactoring and getting the model right, so you're definitely not looking at 18 months of new code, which is why the testimonials claiming that an application was rewritten with language X or framework Y in a tenth of the time are so preposterous.


From my brief brush with Python, I would say a rewrite in Python would be easier to contemplate.

Probably, yes.


The lines-of-code multiple with respect to Java alone would be enough to intimidate me out of that particular path. I prefer to work at as high a level as possible.

So do I, but just because it takes more lines of code to do Web applications in Java than in Ruby doesn't mean that it takes more time to do so.

--
Cedric
testng.org

Re: RoR is not JUST for greenfield apps by karan malhi


So do I, but just because it takes more lines of code to do Web applications in Java than in Ruby doesn't mean that it takes more time to do so.


+1

Re: Wow, Bruce is still going after Java by Robert Dean

And in my world it's practically always Ruby vs. Java vs. C#.


Yes, but the books are Ruby vs. Java. Where's the C# or PHP comparison? That is the core of my point.

My 2 cents on Ruby vs C# and PHP. by Alexey Verkhovsky

Re C#, as the article says "I'd throw C# into the mix as effectively a Java clone, though there's room for some argument."

That's my humble (but well informed) opinion as well. There are differences between C# and Java, but for the purpose of comparing them both to Ruby, those differences are not significant enough.

I know just enough PHP to say that the leap from PHP to Ruby is not that big. You are still on the LAMP stack with dynamic typing and interpreted code. In this genre, Ruby is an obviously better general-purpose programming language than PHP, and (from what I hear) Rails is an obviously better MVC framework than any in the PHP world.

Having said that, there are still valid reasons to stay with PHP in certain situations; even so, the decision should not be too hard to make.

Re: RoR is not JUST for greenfield apps by Alexey Verkhovsky


just because it takes more lines of code to do Web applications in Java than in Ruby doesn't mean that it takes more time to do so.


Not necessarily, although well-written Ruby code tends to be easier to read than well-written Java code, partly because it is shorter, and it is shorter partly because the basic constructs are on average more abstract.

There are many anecdotal accounts of Ruby's better productivity, across a fairly wide range of projects. There are also almost no anecdotal accounts to contradict it.

Just recently, we (ThoughtWorks Canada) prepared a bid for a project that could be done with either ASP.NET or Rails. The Rails estimate was about 30% lower. No, it's not 5 times :) But keep in mind that a software project is not just development; there is business analysis, system testing, infrastructure, deployment, project management, and so on, all included in the estimate. It's not a three-table, five-screen CRUD app, either.

What is risk defined as? by Ian Nelson

Risk that the project cannot be built? Risk that the project cannot be sold? Risk that the product has low quality? Risk that the product cannot perform? Risk that you're doing something different than everyone else? Risk of being ashamed for doing something different?

Market share doesn't really reduce some of those risks, or really have anything at all to do with them. Which risk does "language bloat" impact? Can you back that up with any evidence? Doesn't that contradict your thesis, because Ruby has a lot more language features than Java does? While Java has a lot more libraries, is it the libraries that cause the bloat? I thought that as market share increased, access to libraries/gems increased, and that reduced risk?

Is a new programming language inherently risky? It either works or doesn't, right? The risk is just how much you might have to do to make your project successful with it. You might have to write your own implementation of it, for example, like JRuby.

Seriously, this article isn't even consistent within itself. Is this the kind of crap InfoQ is really about? Bruce, do you have anything to add? An apology? Or maybe take your name off it or something? Or maybe you want to take all the risk stuff out and just talk about how great Ruby and Rails are?
