• Stars: 307
• Rank: 136,109 (top 3%)
• Language: Python
• License: BSD 3-Clause
• Created: over 5 years ago
• Updated: about 1 year ago

Repository Details

Extract price amount and currency symbol from a raw text string

price-parser

price-parser is a small library for extracting price and currency from raw text strings.

Features:

  • robust price amount and currency symbol extraction
  • zero-effort handling of thousand and decimal separators

The main use case is parsing prices extracted from web pages. For example, you can write a CSS/XPath selector which targets an element with a price, and then use this library for cleaning it up, instead of writing custom site-specific regex or Python code.
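
For example, here is a minimal sketch of that workflow. It assumes the parsel selector library is installed (it is not a price-parser dependency), and the HTML snippet and CSS class are made up for illustration:

from parsel import Selector
from price_parser import Price

html = '<span class="product-price">22,90 €</span>'  # hypothetical markup
raw_price = Selector(text=html).css("span.product-price::text").get()
price = Price.fromstring(raw_price)
price.amount    # Decimal('22.90'), as in the basic usage example below
price.currency  # '€'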

License is BSD 3-clause.

Installation

pip install price-parser

price-parser requires Python 3.6+.

Usage

Basic usage

>>> from price_parser import Price
>>> price = Price.fromstring("22,90 €")
>>> price
Price(amount=Decimal('22.90'), currency='€')
>>> price.amount       # numeric price amount
Decimal('22.90')
>>> price.currency     # currency symbol, as appears in the string
'€'
>>> price.amount_text  # price amount, as appears in the string
'22,90'
>>> price.amount_float # price amount as float, not Decimal
22.9

If you prefer, Price.fromstring has an alias, price_parser.parse_price; they do the same thing:

>>> from price_parser import parse_price
>>> parse_price("22,90 €")
Price(amount=Decimal('22.90'), currency='€')

The library has extensive tests (900+ real-world examples of price strings). Some of the supported cases are described below.

Supported cases

Unclean price strings with various currencies are supported; thousand separators and decimal separators are handled:

>>> Price.fromstring("Price: $119.00")
Price(amount=Decimal('119.00'), currency='$')
>>> Price.fromstring("15 130 Р")
Price(amount=Decimal('15130'), currency='Р')
>>> Price.fromstring("151,200 تومان")
Price(amount=Decimal('151200'), currency='تومان')
>>> Price.fromstring("Rp 1.550.000")
Price(amount=Decimal('1550000'), currency='Rp')
>>> Price.fromstring("Běžná cena 75 990,00 Kč")
Price(amount=Decimal('75990.00'), currency='Kč')

The euro sign is sometimes used as a decimal separator in the wild:

>>> Price.fromstring("1,235€ 99")
Price(amount=Decimal('1235.99'), currency='€')
>>> Price.fromstring("99 € 95 €")
Price(amount=Decimal('99'), currency='€')
>>> Price.fromstring("35€ 999")
Price(amount=Decimal('35'), currency='€')

Some special cases are handled:

>>> Price.fromstring("Free")
Price(amount=Decimal('0'), currency=None)

When the price or currency can't be extracted, the corresponding attribute values are set to None:

>>> Price.fromstring("")
Price(amount=None, currency=None)
>>> Price.fromstring("Foo")
Price(amount=None, currency=None)
>>> Price.fromstring("50% OFF")
Price(amount=None, currency=None)
>>> Price.fromstring("50")
Price(amount=Decimal('50'), currency=None)
>>> Price.fromstring("R$")
Price(amount=None, currency='R$')

Currency hints

The currency_hint argument allows you to pass a text string which may (or may not) contain currency information. This feature is most useful for automated price extraction.

>>> Price.fromstring("34.99", currency_hint="руб. (шт)")
Price(amount=Decimal('34.99'), currency='руб.')

Note that a currency mentioned in the main price string may be preferred over the currency specified in the currency_hint argument; it depends on the currency symbols found there. If you know the correct currency, you can set it directly:

>>> price = Price.fromstring("1 000")
>>> price.currency = 'EUR'
>>> price
Price(amount=Decimal('1000'), currency='EUR')

Decimal separator

If you know which symbol is used as the decimal separator in the input string, pass that symbol in the decimal_separator argument to prevent price-parser from guessing the wrong one.

>>> Price.fromstring("Price: $140.600", decimal_separator=".")
Price(amount=Decimal('140.600'), currency='$')
>>> Price.fromstring("Price: $140.600", decimal_separator=",")
Price(amount=Decimal('140600'), currency='$')

Contributing

Use tox to run tests with different Python versions:

tox

The command above also runs type checks; we use mypy.

More Repositories

1. portia - Visual scraping for Scrapy (Python, 8,991 stars)
2. splash - Lightweight, scriptable browser as a service with an HTTP API (Python, 3,898 stars)
3. dateparser - python parser for human readable dates (Python, 2,525 stars)
4. frontera - A scalable frontier for web crawlers (Python, 1,288 stars)
5. slackbot - A chat bot for Slack (https://slack.com) (Python, 1,263 stars)
6. extruct - Extract embedded metadata from HTML markup (Python, 832 stars)
7. scrapyrt - HTTP API for Scrapy spiders (Python, 829 stars)
8. python-crfsuite - A python binding for crfsuite (Python, 770 stars)
9. spidermon - Scrapy Extension for monitoring spiders execution (Python, 530 stars)
10. article-extraction-benchmark - Article extraction benchmark: dataset and evaluation scripts (Python, 268 stars)
11. webstruct - NER toolkit for HTML data (HTML, 252 stars)
12. python-scrapinghub - A client interface for Scrapinghub's API (Python, 195 stars)
13. adblockparser - Python parser for Adblock Plus filters (Python, 187 stars)
14. js2xml - Convert Javascript code to an XML document (Python, 186 stars)
15. testspiders - Useful test spiders for Scrapy (Python, 183 stars)
16. scrapy-training - Scrapy Training companion code (Python, 171 stars)
17. skinfer - Skinfer is a tool for inferring and merging JSON schemas (Python, 140 stars)
18. sample-projects - Sample projects showcasing Scrapinghub tech (Python, 137 stars)
19. shub - Scrapinghub Command Line Client (Python, 125 stars)
20. python-simhash - An efficient simhash implementation for python (C, 122 stars)
21. scrapy-poet - Page Object pattern for Scrapy (Python, 119 stars)
22. number-parser - Parse numbers written in natural language (Python, 108 stars)
23. mdr - A python library to detect and extract listing data from HTML pages (C, 106 stars)
24. web-poet - Web scraping Page Objects core library (Python, 95 stars)
25. aile - Automatic Item List Extraction (HTML, 87 stars)
26. wappalyzer-python - UNMAINTAINED Python wrapper for Wappalyzer (utility that uncovers the technologies used on websites) (Python, 82 stars)
27. pydepta - A python implementation of DEPTA (C, 80 stars)
28. scrapinghub-stack-scrapy - Software stack with latest Scrapy and updated deps (Dockerfile, 60 stars)
29. aduana - Frontera backend to guide a crawl using PageRank, HITS or other ranking algorithms based on the link structure of the web graph, even when making big crawls (one billion pages) (C, 55 stars)
30. scrapy-autoextract - Zyte Automatic Extraction integration for Scrapy (Python, 55 stars)
31. scrapy-autounit - Automatic unit test generation for Scrapy (Python, 55 stars)
32. learn.scrapinghub.com - Scrapinghub Learning Center; report issues in Jira: https://scrapinghub.atlassian.net/projects/WEB (CSS, 55 stars)
33. portia2code (Python, 49 stars)
34. arche - Analyze scraped data (Python, 47 stars)
35. scmongo - MongoDB extensions for Scrapy (Python, 44 stars)
36. exporters - Exporters is an extensible export pipeline library that supports filter, transform and several sources and destinations (Python, 40 stars)
37. webpager - Paginating the web (C, 35 stars)
38. scrapy-frontera - More flexible and featured Frontera scheduler for Scrapy (Python, 35 stars)
39. page_clustering - A simple algorithm for clustering web pages, suitable for crawlers (HTML, 34 stars)
40. flatson - Tool to flatten stream of JSON-like objects, configured via schema (Python, 33 stars)
41. scaws - Extensions for using Scrapy on Amazon AWS (Python, 32 stars)
42. docker-images (Dockerfile, 32 stars)
43. scrapylib - Collection of Scrapy utilities (extensions, middlewares, pipelines, etc) (Python, 31 stars)
44. pycon-speakers - Speakers Spider (PyCon 2014 sprint) (Python, 30 stars)
45. docker-devpi - pypi caching service using devpi and docker (Shell, 28 stars)
46. crawlera-tools - Crawlera tools (Python, 26 stars)
47. scrapinghub-entrypoint-scrapy - Scrapy entrypoint for Scrapinghub job runner (Python, 25 stars)
48. scrapy-mosquitera - Restrict crawl and scraping scope using matchers (Python, 25 stars)
49. andi - Library for annotation-based dependency injection (Python, 20 stars)
50. kafka-scanner - High Level Kafka Scanner (Python, 19 stars)
51. autoextract-spiders - Pre-built Scrapy spiders for AutoExtract (Python, 19 stars)
52. python-cld2 - Python bindings for CLD2 (Python, 17 stars)
53. product-extraction-benchmark (Jupyter Notebook, 16 stars)
54. python-hubstorage - Deprecated HubStorage client library; please use python-scrapinghub>=1.9.0 instead (Python, 16 stars)
55. shublang - Pluggable DSL that uses pipes to perform a series of linear transformations to extract data (Python, 15 stars)
56. shub-workflow (Python, 13 stars)
57. shubc - Go bindings for Scrapinghub HTTP API and a sweet command line tool for Scrapy Cloud (Go, 13 stars)
58. scrapinghub-stack-portia - Software stack used to run Portia spiders in Scrapinghub cloud (Python, 10 stars)
59. tutorials (Python, 8 stars)
60. pastebin (Python, 8 stars)
61. navscraper - Vanguard ETF NAV scraper (Python, 8 stars)
62. varanus - A command line spider monitoring tool (Python, 8 stars)
63. hcf-backend - Crawl Frontier HCF backend (Python, 7 stars)
64. pydatanyc (Python, 7 stars)
65. autoextract-poet - web-poet definitions for AutoExtract (Python, 6 stars)
66. collection-scanner - HubStorage collection scanner library (Python, 5 stars)
67. locode (Python, 5 stars)
68. adblockgoparser - Golang parser for Adblock Plus filters (Go, 4 stars)
69. autoextract-examples (Jupyter Notebook, 4 stars)
70. webstruct-demo - HTTP demo for https://github.com/scrapinghub/webstruct (Python, 4 stars)
71. shub-image - Deprecated client-side tool to prepare docker images to run crawlers in Scrapinghub; please use shub>=2.5.0 instead (Python, 4 stars)
72. docker-cloudera-manager - Run Cloudera Manager in docker (Dockerfile, 3 stars)
73. custom-images-examples - Examples of custom images running on Scrapinghub platform (3 stars)
74. hubstorage-frontera - Hubstorage crawl frontier backend for Frontera (Python, 3 stars)
75. httpation (Erlang, 3 stars)
76. xpathcsstutorial - [Work in progress] XPath & CSS for web scraping tutorial (Jupyter Notebook, 3 stars)
77. epmdless_dist (Erlang, 2 stars)
78. egraylog (Erlang, 2 stars)
79. scrapinghub-conda-recipes - Conda packages for scrapinghub channel (Shell, 2 stars)
80. pydaybot - Demo bot for Python Day Uruguay 2011 (Python, 2 stars)
81. erl-iputils (Erlang, 1 star)
82. jupyterhub-stacks - Docker images for jhub cluster (Python, 1 star)
83. cld2 - Compact Language Detector 2 (C++, 1 star)
84. scrapinghub-stack-hworker - [DEPRECATED] Software stack fully compatible with Scrapy Cloud 1.0 (Python, 1 star)
85. crawlera.com - crawlera.com website (HTML, 1 star)
86. discourse-sso-google - Use Google as Single-Sign-On provider for Discourse (Python, 1 star)
87. pkg-opengrok - Ubuntu packaging for OpenGrok (Shell, 1 star)