What is the best way of implementing singleton in Python?

This question is not for the discussion of whether or not the singleton design pattern is desirable, is an anti-pattern, or for any religious wars, but to discuss how this pattern is best implemented in Python in such a way that is most pythonic. In this instance I define 'most pythonic' to mean that it follows the 'principle of least astonishment'.

I have multiple classes which would become singletons (my use-case is for a logger, but this is not important). I do not wish to clutter several classes with added gumph when I can simply inherit or decorate.

Best methods:


Method 1: A decorator

def singleton(class_):
    instances = {}
    def getinstance(*args, **kwargs):
        if class_ not in instances:
            instances[class_] = class_(*args, **kwargs)
        return instances[class_]
    return getinstance

@singleton
class MyClass(BaseClass):
    pass

Pros

  • Decorators are additive in a way that is often more intuitive than multiple inheritance.

Cons

  • While objects created using MyClass() would be true singleton objects, MyClass itself is a function, not a class, so you cannot call class methods from it. Also for

    x = MyClass();
    y = MyClass();
    t = type(x)();
    

then x == y but x != t and y != t
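
To make the class-method limitation concrete, here is a small sketch (the classmethod create_default is purely illustrative):

def singleton(class_):
    instances = {}
    def getinstance(*args, **kwargs):
        if class_ not in instances:
            instances[class_] = class_(*args, **kwargs)
        return instances[class_]
    return getinstance

@singleton
class MyClass(object):
    @classmethod
    def create_default(cls):
        return cls()

# After decoration, the name MyClass is bound to the getinstance function,
# so class-level access fails:
MyClass.create_default()  # AttributeError: 'function' object has no attribute 'create_default'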


Method 2: A base class

class Singleton(object):
    _instance = None
    def __new__(class_, *args, **kwargs):
        if not isinstance(class_._instance, class_):
            class_._instance = object.__new__(class_, *args, **kwargs)
        return class_._instance

class MyClass(Singleton, BaseClass):
    pass

Pros

  • It's a true class

Cons

  • Multiple inheritance - eugh! __new__ could be overridden during inheritance from a second base class? One has to think more than is necessary (a sketch of this pitfall follows).
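
A sketch of the kind of surprise meant here (class names are illustrative): if a second base class defines its own __new__ and ends up first in the MRO, the singleton machinery is silently bypassed.

class Singleton(object):
    _instance = None
    def __new__(class_, *args, **kwargs):
        if not isinstance(class_._instance, class_):
            class_._instance = object.__new__(class_)
        return class_._instance

class OtherBase(object):
    def __new__(cls, *args, **kwargs):
        # an unrelated base class with its own __new__
        return object.__new__(cls)

class MyClass(OtherBase, Singleton):  # OtherBase.__new__ wins the MRO lookup
    pass

print(MyClass() is MyClass())  # False -- Singleton.__new__ is never consulted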

Method 3: A metaclass

class Singleton(type):
    _instances = {}
    def __call__(cls, *args, **kwargs):
        if cls not in cls._instances:
            cls._instances[cls] = super(Singleton, cls).__call__(*args, **kwargs)
        return cls._instances[cls]

#Python2
class MyClass(BaseClass):
    __metaclass__ = Singleton

#Python3
class MyClass(BaseClass, metaclass=Singleton):
    pass

Pros

  • It's a true class
  • Auto-magically covers inheritance (see the quick check below)
  • Uses __metaclass__ for its proper purpose (and made me aware of it)

Cons

  • Are there any?
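
A quick check of the "auto-magically covers inheritance" point above (Python 3 syntax; class names are illustrative):

class Singleton(type):
    _instances = {}
    def __call__(cls, *args, **kwargs):
        if cls not in cls._instances:
            cls._instances[cls] = super(Singleton, cls).__call__(*args, **kwargs)
        return cls._instances[cls]

class MyClass(metaclass=Singleton):
    pass

class MySubClass(MyClass):
    pass

print(MyClass() is MyClass())        # True
print(MySubClass() is MySubClass())  # True -- the subclass is a singleton too
print(MyClass() is MySubClass())     # False -- each class keeps its own single instance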

Method 4: A decorator returning a class with the same name

def singleton(class_):
    class class_w(class_):
        _instance = None
        def __new__(class_, *args, **kwargs):
            if class_w._instance is None:
                class_w._instance = super(class_w,
                                    class_).__new__(class_,
                                                    *args,
                                                    **kwargs)
                class_w._instance._sealed = False
            return class_w._instance
        def __init__(self, *args, **kwargs):
            if self._sealed:
                return
            super(class_w, self).__init__(*args, **kwargs)
            self._sealed = True
    class_w.__name__ = class_.__name__
    return class_w

@singleton
class MyClass(BaseClass):
    pass

Pros

  • It's a true class
  • Auto-magically covers inheritance

Cons

  • Is there not an overhead for creating each new class? Here we are creating two classes for each class we wish to make a singleton. While this is fine in my case, I worry that this might not scale. Of course there is a matter of debate as to whether it ought to be too easy to scale this pattern...
  • What is the point of the _sealed attribute?
  • Can't call methods of the same name on base classes using super() because they will recurse. This means you can't customize __new__ and can't subclass a class that needs you to call up to __init__.

Method 5: A module

a module file singleton.py
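
A minimal sketch of what such a singleton.py might contain (the shared state and function names are purely illustrative); the module object itself acts as the single instance, because Python caches imported modules in sys.modules:

# singleton.py
_log_lines = []

def log(message):
    """Record a message in the shared log."""
    _log_lines.append(message)

def lines():
    """Return a copy of everything logged so far."""
    return list(_log_lines)

Every import singleton elsewhere in the program receives the same cached module object, so all callers share the same _log_lines.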

Pros

  • Simple is better than complex

Cons

  • Not lazily instantiated

Best Answer

Use a Metaclass

I would recommend Method #2, but you're better off using a metaclass than a base class. Here is a sample implementation:

class Singleton(type):
    _instances = {}
    def __call__(cls, *args, **kwargs):
        if cls not in cls._instances:
            cls._instances[cls] = super(Singleton, cls).__call__(*args, **kwargs)
        return cls._instances[cls]
        
class Logger(object):
    __metaclass__ = Singleton

Or in Python 3:

class Logger(metaclass=Singleton):
    pass

If you want to run __init__ every time the class is called, add

        else:
            cls._instances[cls].__init__(*args, **kwargs)

to the if statement in Singleton.__call__.
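
Spelled out, __call__ with that branch added would look like this (the same metaclass as above, plus the else clause):

class Singleton(type):
    _instances = {}
    def __call__(cls, *args, **kwargs):
        if cls not in cls._instances:
            cls._instances[cls] = super(Singleton, cls).__call__(*args, **kwargs)
        else:
            # re-run __init__ on the existing instance with the new arguments
            cls._instances[cls].__init__(*args, **kwargs)
        return cls._instances[cls]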

A few words about metaclasses. A metaclass is the class of a class; that is, a class is an instance of its metaclass. You find the metaclass of an object in Python with type(obj). Normal new-style classes are of type type. Logger in the code above will be of type <class 'your_module.Singleton'>, just as the (only) instance of Logger will be of type <class 'your_module.Logger'>. When you call Logger(), Python first asks the metaclass of Logger, Singleton, what to do, allowing instance creation to be pre-empted. This process is the same as Python asking a class what to do by calling __getattr__ when you reference one of its attributes with myclass.attribute.
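
A short sketch of those relationships, assuming the Python 3 Singleton/Logger definitions above (the printed module name depends on where the classes are defined):

logger = Logger()
print(type(logger))        # <class '...Logger'> -- the instance's class is Logger
print(type(Logger))        # <class '...Singleton'> -- the class's class is its metaclass
print(Logger() is logger)  # True -- Singleton.__call__ returns the cached instance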

A metaclass essentially decides what the definition of a class means and how to implement that definition. See for example http://code.activestate.com/recipes/498149/, which essentially recreates C-style structs in Python using metaclasses. The thread What are some (concrete) use-cases for metaclasses? also provides some examples; they generally seem to be related to declarative programming, especially as used in ORMs.

In this situation, if you use your Method #2, and a subclass defines a __new__ method, it will be executed every time you call SubClassOfSingleton() -- because it is responsible for calling the method that returns the stored instance. With a metaclass, it will only be called once, when the only instance is created. You want to customize what it means to call the class, which is decided by its type.
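
A sketch that makes the difference visible, assuming the Singleton metaclass above (Python 3 syntax; the Counted class and calls list are illustrative):

calls = []

class Counted(metaclass=Singleton):
    def __new__(cls, *args, **kwargs):
        calls.append(cls)      # runs only when an instance is actually created
        return super().__new__(cls)

Counted(); Counted(); Counted()
print(len(calls))  # 1 -- with the metaclass, __new__ ran exactly once, on the first call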

In general, it makes sense to use a metaclass to implement a singleton. A singleton is special because it is created only once, and a metaclass is the way you customize the creation of a class. Using a metaclass gives you more control in case you need to customize the singleton class definitions in other ways.

Your singletons won't need multiple inheritance (because the metaclass is not a base class), but for subclasses of the created class that use multiple inheritance, you need to make sure the singleton class is the first / leftmost one with a metaclass that redefines __call__. This is very unlikely to be an issue. The instance dict is not in the instance's namespace so it won't accidentally overwrite it.

You will also hear that the singleton pattern violates the "Single Responsibility Principle" -- each class should do only one thing. That way you don't have to worry about messing up one thing the code does if you need to change another, because they are separate and encapsulated. The metaclass implementation passes this test. The metaclass is responsible for enforcing the pattern, and the created class and subclasses need not be aware that they are singletons. Method #1 fails this test, as you noted with "MyClass itself is a function, not a class, so you cannot call class methods from it."

Python 2 and 3 Compatible Version

Writing something that works in both Python 2 and 3 requires a slightly more complicated scheme. Since metaclasses are usually subclasses of type, it's possible to use one to dynamically create an intermediary base class at run time with it as its metaclass, and then use that intermediary as the base class of the public Singleton base class. It's harder to explain than to do, as illustrated next:

# works in Python 2 & 3
class _Singleton(type):
    """ A metaclass that creates a Singleton base class when called. """
    _instances = {}
    def __call__(cls, *args, **kwargs):
        if cls not in cls._instances:
            cls._instances[cls] = super(_Singleton, cls).__call__(*args, **kwargs)
        return cls._instances[cls]

class Singleton(_Singleton('SingletonMeta', (object,), {})): pass

class Logger(Singleton):
    pass

An ironic aspect of this approach is that it's using subclassing to implement a metaclass. One possible advantage is that, unlike with a pure metaclass, isinstance(inst, Singleton) will return True.
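
For example, with the Logger defined above, both properties can be checked directly:

log_a = Logger()
log_b = Logger()
print(log_a is log_b)                # True -- only one instance is ever created
print(isinstance(log_a, Singleton))  # True -- Singleton really is a base class here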

Corrections

On another topic, you've probably already noticed this, but the base class implementation in your original post is wrong. _instances needs to be referenced on the class, you need to use super() or you're recursing, and __new__ is actually a static method that you have to pass the class to, not a class method, since no instance exists yet when it is called. All of these things will be true for a metaclass implementation as well.

class Singleton(object):
    _instances = {}
    def __new__(class_, *args, **kwargs):
        if class_ not in class_._instances:
            class_._instances[class_] = super(Singleton, class_).__new__(class_, *args, **kwargs)
        return class_._instances[class_]

class MyClass(Singleton):
    pass

c = MyClass()

Decorator Returning A Class

I originally was writing a comment but it was too long, so I'll add this here. Method #4 is better than the other decorator version, but it's more code than needed for a singleton, and it's not as clear what it does.

The main problems stem from the class being its own base class. First, isn't it weird to have a class be a subclass of a nearly identical class with the same name that exists only in its __class__ attribute? This also means that you can't define any methods that call the method of the same name on their base class with super() because they will recurse. This means your class can't customize __new__, and can't derive from any classes that need __init__ called on them.
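
A trimmed-down sketch of that recursion (Method #4's decorator without the _sealed bookkeeping; the greet method is illustrative, and the explicit two-argument super call is what triggers the loop):

def singleton(class_):
    class class_w(class_):
        _instance = None
        def __new__(cls, *args, **kwargs):
            if class_w._instance is None:
                class_w._instance = super(class_w, cls).__new__(cls)
            return class_w._instance
    class_w.__name__ = class_.__name__
    return class_w

class Base(object):
    def greet(self):
        return "base"

@singleton
class MyClass(Base):
    def greet(self):
        # At call time the name MyClass refers to the wrapper class class_w,
        # so this lookup resolves back to this very method and recurses.
        return super(MyClass, self).greet() + " decorated"

MyClass().greet()  # RuntimeError/RecursionError: maximum recursion depth exceeded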

When to use the singleton pattern

Your use case is one of the better examples of wanting to use a singleton. You say in one of the comments "To me logging has always seemed a natural candidate for Singletons." You're absolutely right.

When people say singletons are bad, the most common reason is that they are implicit shared state. While global variables and top-level module imports are explicit shared state, other objects that are passed around are generally instantiated. This is a good point, with two exceptions.

The first, and one that gets mentioned in various places, is when the singletons are constant. Use of global constants, especially enums, is widely accepted, and considered sane because no matter what, none of the users can mess them up for any other user. This is equally true for a constant singleton.

The second exception, which gets mentioned less, is the opposite -- when the singleton is only a data sink, not a data source (directly or indirectly). This is why loggers feel like a "natural" use for singletons. As the various users are not changing the loggers in ways other users will care about, there is not really shared state. This negates the primary argument against the singleton pattern, and makes them a reasonable choice because of their ease of use for the task.

Here is a quote from http://googletesting.blogspot.com/2008/08/root-cause-of-singletons.html:

Now, there is one kind of Singleton which is OK. That is a singleton where all of the reachable objects are immutable. If all objects are immutable than Singleton has no global state, as everything is constant. But it is so easy to turn this kind of singleton into mutable one, it is very slippery slope. Therefore, I am against these Singletons too, not because they are bad, but because it is very easy for them to go bad. (As a side note Java enumeration are just these kind of singletons. As long as you don't put state into your enumeration you are OK, so please don't.)

The other kind of Singletons, which are semi-acceptable are those which don't effect the execution of your code, They have no "side effects". Logging is perfect example. It is loaded with Singletons and global state. It is acceptable (as in it will not hurt you) because your application does not behave any different whether or not a given logger is enabled. The information here flows one way: From your application into the logger. Even thought loggers are global state since no information flows from loggers into your application, loggers are acceptable. You should still inject your logger if you want your test to assert that something is getting logged, but in general Loggers are not harmful despite being full of state.
