uap-python
==========

Official python implementation of the `User Agent String Parser
<https://github.com/ua-parser>`_ project.

Build Status
------------

.. image:: https://github.com/ua-parser/uap-python/actions/workflows/ci.yml/badge.svg
   :target: https://github.com/ua-parser/uap-python/actions/workflows/ci.yml?query=branch%3Amaster
   :alt: CI on the master branch

.. image:: https://readthedocs.org/projects/uap-python/badge/?version=latest
   :target: https://uap-python.readthedocs.io/
   :alt: Documentation Status

Installing
----------

Add ``ua-parser[regex]`` to your project's dependencies, or run

.. code-block:: sh

    $ pip install 'ua-parser[regex]'

to install it in the current environment.

ua-parser supports CPython 3.10 and newer, recent PyPy (supporting
3.10), and GraalPy 25.

.. note:: The ``[regex]`` feature is *strongly* recommended: the pure
   python resolver (no feature) is *significantly* slower, especially
   on non-CPython runtimes, though it is the most memory efficient.
   See `builtin resolvers`_ for more explanation of the tradeoffs
   between the different options.

.. _builtin resolvers: https://uap-python.readthedocs.io/stable/guides.html#builtin-resolvers

Quick Start
-----------

Retrieve all data on a user-agent string
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. code-block:: python

    >>> from ua_parser import parse
    >>> ua_string = 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.104 Safari/537.36'
    >>> parse(ua_string) # doctest: +NORMALIZE_WHITESPACE, +ELLIPSIS
    Result(user_agent=UserAgent(family='Chrome',
                                major='41',
                                minor='0',
                                patch='2272',
                                patch_minor='104'),
           os=OS(family='Mac OS X',
                 major='10',
                 minor='9',
                 patch='4',
                 patch_minor=None),
           device=Device(family='Mac', brand='Apple', model='Mac'),
           string='Mozilla/5.0 (Macintosh; Intel Mac OS...
Any datum not found in the user agent string is set to ``None``::

    >>> parse("")
    Result(user_agent=None, os=None, device=None, string='')

Extract only browser data from user-agent string
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. code-block:: python

    >>> from ua_parser import parse_user_agent
    >>> ua_string = 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.104 Safari/537.36'
    >>> parse_user_agent(ua_string)
    UserAgent(family='Chrome', major='41', minor='0', patch='2272', patch_minor='104')

For specific domains, a match failure just returns ``None``::

    >>> parse_user_agent("")

Extract OS information from user-agent string
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. code-block:: python

    >>> from ua_parser import parse_os
    >>> ua_string = 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.104 Safari/537.36'
    >>> parse_os(ua_string)
    OS(family='Mac OS X', major='10', minor='9', patch='4', patch_minor=None)

Extract device information from user-agent string
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. code-block:: python

    >>> from ua_parser import parse_device
    >>> ua_string = 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.104 Safari/537.36'
    >>> parse_device(ua_string)
    Device(family='Mac', brand='Apple', model='Mac')

Upgrading
---------

Upgrading from 0.x? See `the upgrade guide`_.

.. _the upgrade guide: https://uap-python.readthedocs.io/stable/advanced/migration.html
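Because every component of a ``Result`` may be ``None``, application code typically guards each field before using it. A minimal sketch (the ``describe`` helper is hypothetical, not part of the library):

.. code-block:: python

    from ua_parser import parse

    def describe(ua_string: str) -> str:
        # parse() returns a Result whose user_agent/os/device components
        # are None when nothing matched, so guard before formatting.
        result = parse(ua_string)
        browser = result.user_agent.family if result.user_agent else "unknown browser"
        os_name = result.os.family if result.os else "unknown OS"
        return f"{browser} on {os_name}"

    print(describe("Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9_4) "
                   "AppleWebKit/537.36 (KHTML, like Gecko) "
                   "Chrome/41.0.2272.104 Safari/537.36"))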
Release History
---------------

.. list-table::
   :header-rows: 1

   * - Version
     - Changes
     - Urgency
     - Date
   * - 1.0.2
     - Imported from PyPI (1.0.2)
     - Low
     - 2026-04-21
   * - 1.0.1
     - Fixes the global parser being re-created on every access, which
       is quite expensive for the re2- and regex-based parsers and
       completely negates their purpose. The issue could be worked
       around by using a from-import (``from ua_parser import
       parser``), as Python memoizes the attribute in that case
       (though every such from-import would still create its own
       parser), and parsing only through that parser rather than the
       top-level utilities, or by setting the global parser by hand.
     - Low
     - 2025-02-01
   * - 1.0.0
     - Release 1.0.0
     - Low
     - 2024-11-28
   * - 1.0.0a2
     - The previous release had a number of broken URLs (docs and
       PyPI); this should hopefully fix them.
     - Low
     - 2024-11-26
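The 1.0.1 entry above suggests binding the module-level parser once instead of re-resolving it on every call. A sketch of that workaround, assuming the 1.x module-level ``parser`` attribute mentioned in the changelog:

.. code-block:: python

    # Bind the lazily-created global parser once at import time; all
    # parsing in this module then goes through the same instance, so
    # the (potentially expensive) regex/re2 machinery is built once.
    from ua_parser import parser

    result = parser.parse(
        "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9_4) AppleWebKit/537.36"
        " (KHTML, like Gecko) Chrome/41.0.2272.104 Safari/537.36"
    )
    print(result.user_agent.family if result.user_agent else None)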
