Access django models inside of Scrapy

Is it possible to access my django models inside of a Scrapy pipeline, so that I can save my scraped data straight to my model?

I've seen this, but I don't really understand how to set it up.


If anyone else is having the same problem, this is how I solved it.

I added this to my scrapy settings.py file:

def setup_django_env(path):
    import imp
    from django.core.management import setup_environ

    # Locate and import the Django project's settings module from the given path
    f, filename, desc = imp.find_module('settings', [path])
    project = imp.load_module('settings', f, filename, desc)

    # Configure Django so its models can be imported from scrapy code
    # (setup_environ was removed in Django 1.6; on newer versions set
    # DJANGO_SETTINGS_MODULE and call django.setup() instead)
    setup_environ(project)

setup_django_env('/path/to/django/project/')

Note: the path above is to your django project folder, not the settings.py file.

Now you will have full access to your django models inside of your scrapy project.
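
For example, here is a minimal pipeline sketch that saves each scraped item through the Django ORM. The Article model, its fields, and the item keys are placeholder assumptions, and the pipeline still has to be enabled in ITEM_PIPELINES:

# pipelines.py in the scrapy project
# Assumes the Django environment has been set up as above and that a
# hypothetical model myapp.models.Article with 'title' and 'url' fields exists.
from myapp.models import Article

class DjangoWriterPipeline(object):

    def process_item(self, item, spider):
        # Copy the scraped fields onto a new model instance and persist it
        Article.objects.create(
            title=item['title'],
            url=item['url'],
        )
        return item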


The opposite solution (set up scrapy inside a Django management command):

# -*- coding: utf-8 -*-
# myapp/management/commands/scrapy.py

from __future__ import absolute_import
from django.core.management.base import BaseCommand

class Command(BaseCommand):

    def run_from_argv(self, argv):
        # Stash the raw argv and skip Django's own option parsing,
        # so every argument is passed through to scrapy untouched
        self._argv = argv
        self.execute()

    def handle(self, *args, **options):
        from scrapy.cmdline import execute
        # self._argv is ['./manage.py', 'scrapy', ...]; drop 'manage.py'
        # so scrapy parses the rest as its own command line
        execute(self._argv[1:])

and in the Django project's settings.py:

import os
os.environ['SCRAPY_SETTINGS_MODULE'] = 'scrapy_project.settings'

Then, instead of scrapy foo, run ./manage.py scrapy foo.

UPD: fixed the code to bypass Django's option parsing.


Set the DJANGO_SETTINGS_MODULE environment variable in your scrapy project's settings.py:

import os
os.environ['DJANGO_SETTINGS_MODULE'] = 'your_django_project.settings'

Now you can use DjangoItem in your scrapy project.
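
For example, a minimal DjangoItem sketch (the Article model is a placeholder, and depending on your scrapy version DjangoItem lives in scrapy.contrib.djangoitem or in the separate scrapy_djangoitem package):

# items.py in the scrapy project
# Assumes a hypothetical Django model myapp.models.Article
from scrapy.contrib.djangoitem import DjangoItem
from myapp.models import Article

class ArticleItem(DjangoItem):
    # Scrapy fields are generated automatically from the model's fields
    django_model = Article

A pipeline can then call item.save() to write the scraped data to the database.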

Edit:
You have to make sure that your_django_project's settings.py is available on your PYTHONPATH.