views: 387
answers: 8

I recently installed a module from CPAN and noticed one of its dependencies was common::sense, a module that offers to enable all the warnings you want, and none that you don't. From the module's synopsis:

use common::sense;

# supposed to be the same, with much lower memory usage, as:
#
# use strict qw(vars subs);
# use feature qw(say state switch);
# no warnings;
# use warnings qw(FATAL closed threads internal debugging pack substr malloc
#                 unopened portable prototype inplace io pipe unpack regexp
#                 deprecated exiting glob digit printf utf8 layer
#                 reserved parenthesis taint closure semicolon);
# no warnings qw(exec newline);

Save for undef warnings sometimes being a hassle, I've usually found the standard warnings to be good. Is it worth switching to common::sense instead of my normal use strict; use warnings;?

+7  A: 

Not everyone's idea of common sense is the same - in that respect it's anything but common.

Go with what you know. If you get undef warnings, chances are that your program or its input is incorrect.

Warnings are there for a reason. Anything that reduces them cannot be useful. (I always compile with gcc -Wall too...)

Alnitak
Don't forget to use -Wextra too :D
Jonathan Leffler
Undef warnings usually just mean I've forgotten to qualify some string comparison with `defined $foo`.
Dan
Or you've gotten a hash key wrong, or a function that you thought *couldn't* return undef has returned undef, or you have a hole in your control flow... Explicitly checking definedness in a few places is an acceptable price to pay for a little help in detecting those *other* mistakes :)
hobbs
+4  A: 

I have never had a warning that wasn't something dodgy/just plain wrong in my code. For me, it's always something technically allowed that I almost certainly don't want to do. I think the full suite of warnings is invaluable. If you find use strict + use warnings adequate for now, I don't see why you'd want to change to using a non-standard module which is then a dependency for every piece of code you write from here on out...

ire_and_curses
+2  A: 

When it comes to warnings, I support the use of any module or built-in language feature that gives you the level of warnings that helps you make your code as solid and reliable as it can possibly be. An ignored warning is not helpful to anyone.

But if you're cozy with the standard warnings, stick with it. Coding to a stricter standard is great if you're used to it! I wouldn't recommend switching just for the memory savings. Only switch if the module helps you turn your code around quicker and with more confidence.

joealba
+5  A: 

The "lower memory usage" only works if you use no modules that load strict, feature, warnings, etc. and the "much" part is...not all that much.

ysth
A couple hundred kB, which is honestly more than some people probably *expect*, but it's still a drop in the bucket. It's almost all shareable memory, too. Not "shared" like the text sections of libraries, but if something uses warnings and then forks, basically none of that 400kB or so will ever get COWed.
hobbs
(read that as "it won't get written and therefore won't get copied" -- that is, the data *is* COW :)
hobbs
+9  A: 

I would say stick with warnings and strict for two main reasons.

  1. If other people are going to use or work with your code, they are (almost certainly) used to warnings and strict and their rules. Those represent a community norm that you and other people you work with can count on.
  2. Even if this or that specific piece of code is just for you, you probably don't want to worry about remembering "Is this the project where I adhere to warnings and strict or the one where I hew to common::sense?" Moving back and forth between the two modes will just confuse you.
Telemachus
+1, minimise cognitive load whenever possible. Also, if there is a particular warning/strict violation that's bugging you but you know the code is doing the right thing, isolate the smallest piece of code that triggers it inside a block and put "no warnings;" or "no strict;" inside at the top.
j_random_hacker
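For what it's worth, a minimal made-up sketch of the block-scoped approach j_random_hacker describes: no warnings is lexical, so it silences only the smallest enclosing block, and full warnings resume as soon as that block ends.

use strict;
use warnings;

my @fields = ('alpha', undef, 'gamma');

{
    # Silence warnings only inside this one small block; the rest of
    # the file still runs with warnings fully enabled.
    no warnings;
    print "field: $_\n" for @fields;   # the undef entry won't warn here
}
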
+6  A: 

I obviously have no common sense because I'm going more for Modern::Perl ;-)

/I3az/

draegtun
+1 for your craziness.
innaM
+11  A: 

While I like the idea of reducing boiler-plate code, I am deeply suspicious of tools like Modern::Perl and common::sense.

The problem I have with modules like this is that they bundle up a group of behaviors and hide them behind glib names with changeable meanings.

For example, Modern::Perl today consists of enabling some Perl 5.10 features and using strict and warnings. But what happens when Perl 5.12 or 5.14 or 5.24 comes out with great new goodies, and the community discovers that we need to use the frobnitz pragma everywhere? Will Modern::Perl provide a consistent set of behaviors, or will it remain "Modern"? If MP keeps up with the times, it will break existing systems that don't keep lock-step with its compiler requirements, and it adds extra compatibility testing to every upgrade. At least that's my reaction to MP. I'll be the first to admit that chromatic is about 10 times smarter than me and a better programmer as well--but I still disagree with his judgment on this issue.

common::sense has a name problem, too. Whose idea of common sense is involved? Will it change over time?

My preference would be for a module that makes it easy for me to create my own set of standard modules, and even create groups of related modules/pragmas for specific tasks (like date time manipulation, database interaction, html parsing, etc).

I like the idea of Toolkit, but it sucks for several reasons: it uses source filters, and the macro system is overly complex and fragile. I have the utmost respect for Damian Conway, and he produces brilliant code, but sometimes he goes a bit too far (at least for production use, experimentation is good).

I haven't lost enough time typing use strict; use warnings; to feel the need to create my own standard import module. If I felt a strong need for automatically loading a set of modules/pragmas, something similar to Toolkit that allows one to create standard feature groups would be ideal:

use My::Tools qw( standard datetime SQLite );

or

use My::Tools;
use My::Tools::DateTime;
use My::Tools::SQLite;
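A bare-bones sketch of what such a bundle might look like (My::Tools is a hypothetical name, and the contents are purely illustrative): because strict and warnings are lexical pragmas, calling their import methods from your own import enables them in whatever code says use My::Tools.

package My::Tools;

# Hypothetical bundle module, sketched only for illustration.
use strict;
use warnings;

sub import {
    # Invoking the pragmas' import methods while the caller's
    # "use My::Tools" is being compiled turns them on in the
    # caller's scope, not just in this file.
    strict->import;
    warnings->import;
}

1;

Task-specific groups like My::Tools::DateTime could follow the same pattern, loading whatever extra modules they need.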

Toolkit comes very close to my ideal. Its fatal defects are a bummer.

As for whether the choice of pragmas makes sense, that's a matter of taste. I'd rather use the occasional no strict 'foo' or no warnings 'bar' in a block where I need the ability to do something that requires it, than disable the checks over my entire file. Plus, IMO, memory consumption is a red herring. YMMV.
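To make that concrete, a small invented example of the block-scoped style: only the 'refs' check is relaxed, and only inside the one routine that needs symbolic references, while the rest of the file keeps full strictures.

use strict;
use warnings;

sub greet { return "hello, $_[0]" }

sub call_by_name {
    my ($name, @args) = @_;
    # Symbolic references are normally forbidden under strict; relax
    # only the 'refs' category, and only within this block.
    no strict 'refs';
    return &$name(@args);
}

print call_by_name('greet', 'world'), "\n";   # prints "hello, world"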

update

It seems that there are many (how many?) different modules of this type floating around CPAN.

  • There is latest, which is no longer the latest. Demonstrates part of the naming problem.
  • Also uni::perl, which adds enabling unicode to the mix.
  • ToolSet offers a subset of Toolkit's abilities, but without source filters.
  • I'll include Moose here, since it automatically adds strict and warnings to the calling package.
  • And finally Acme::Very::Modern::Perl

The proliferation of these modules, and the potential for overlapping requirements, adds another issue.

What happens if you write code like:

use Moose;
use common::sense;

What pragmas are enabled with what options?

daotoad
Modern::Perl is going to allow you to make it explicit as to which version of "modern" you're buying in to. Something like "use Modern::Perl '2009'" or some such.
mpeters
Good grief! More documentation to read when all I was trying to say a couple of years ago was "use strict; use warnings;"?
innaM
@mpeters, so to understand a given MP invocation, I'll have to be a Perl historian? No thanks. ++ for the info, though.
daotoad
Excellent points, too often people don't think about the future implications of their choices of names. mpeters is right though -- the key to avoiding total chaos is (some form of) versioning. Which does make it harder to understand a given invocation, but that's better than having an unversioned module with semantics that change right from under you. That said, a name that is a succinct functional description would be better, and an easy way to combine your own arbitrary groups of modules better still.
j_random_hacker
Even if Modern::Perl one day changes and breaks your code, it's no different from all the other modules out there which can undergo a 100% API change, get installed, and break your code. If you're a CPAN author and you depend on stuff, be prepared to have to fix it one day if your dep changes; that's life.
Kent Fredric
@Kent, the problem is that what constitutes "modern" Perl practice is guaranteed to change. Also, when Perl 5.12 comes out and MP starts requiring a new feature in 5.12 as its default, then I have to go and fix all my existing, working code to use Modern::Perl::2009 or 2010 or something. Over time the situation gets worse. All these problems exist in any dependency, but other dependencies don't PROMISE to break over time.
daotoad
The problem doesn't get worse if you `use Modern::Perl '2009-10'` *today* :)
hobbs
@hobbs, too bad `use Modern::Perl '2009-10'` is not documented as part of the API, *today* :) http://search.cpan.org/dist/Modern-Perl-1.03/lib/Modern/Perl.pm
daotoad
+3  A: 

There is one bit nobody else seems to have picked up on, and that's FATAL in the warnings list.

So as of 2.0, use common::sense is more akin to:

use strict; 
use warnings FATAL => 'all'; # except with common::sense's specific list of fatal categories instead of 'all'

This is a somewhat important and frequently overlooked feature of warnings that ramps the strictness up a whole degree. Instead of undef string interpolation or infinite recursion just warning you and then carrying on despite the problem, it actually halts.

To me this is helpful, because in many cases undef string interpolation leads to further, more dangerous errors which may otherwise go silently unnoticed, and failing and bailing early is a good thing.
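A tiny made-up demonstration of that behaviour (the hash and key are invented for illustration): with warnings made fatal, what would normally be a mere "uninitialized value" warning stops the program on the spot.

use strict;
use warnings FATAL => 'all';   # as in the snippet above: every warning now dies

my %config;                               # 'hostname' is never set
my $url = "http://$config{hostname}/";    # dies here instead of merely warning
print "never reached: $url\n";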

Kent Fredric