logging_strict

joined 1 year ago
[–] logging_strict@programming.dev 1 points 2 days ago (1 children)

There is an expression: Linux isn't free, it costs you your time. Which might be a counter-argument against always using only what is built in.

I'm super guilty of reinventing the wheel. But writing overly verbose code isn't fun either. Never seem to get very far.

[–] logging_strict@programming.dev 2 points 2 days ago* (last edited 2 days ago) (3 children)

people are forced to install dependencies

This ^^.

If possible, Python dependency management is a burden I'd prefer to avoid. Until I can't, then be skilled at it!

disclosure: i use/wrote wreck for Python dependency management.

Compiled languages should really live within containers. At all costs, I'd like to avoid time-consuming system updates! I can no longer install C programs cuz the OS partition ran out of hard disk space, whereas Python packages can be installed on data storage partitions.

for Python, I usually deliver the script as a single .py file

I'm sure you are already aware of this, so forgive me if this is just being Captain Obvious.

Even if the deliverable is a single .py file, there is support for specifying dependencies within a module-level comment block. (i forget the PEP #).
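If it helps, that inline script metadata format (PEP 723, if i recall correctly) is a TOML block inside a comment at the top of the module; the dependency below is a placeholder:

```python
# /// script
# requires-python = ">=3.10"
# dependencies = [
#     "requests",          # placeholder dependency
# ]
# ///
"""A single .py deliverable; a metadata-aware runner (e.g. `pipx run` or
`uv run`) reads the block above and installs the deps before executing."""
msg = "runs fine as plain Python too; the metadata block is just comments"
print(msg)
```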

I don’t like that (unless it's a shell script, but that is by its nature a dependency hell)

You and i could bond over a hatefest on shell scripts, but let's leave that outside the discussion scope.

And to your argument: as the complexity of a .py script grows, it very quickly reaches the point where the deliverable becomes a Python package. With the exceptions being projects which are: external-language, low-level, or simple. This .py script nonsense does not scale and is exceedingly rare to encounter. It may be an indication of an old/dated or unmaintained project.

From a random venv, installed scripts:

_black_version.py
appdirs.py
cfgv.py
distutils-precedence.pth
mccabe.py
mypy_extensions.py
nodeenv.py
packaging_legacy_version.py
pip_requirements_parser.py
py.py
pycodestyle.py
pyi.py
six.py
typing_extensions.py
[–] logging_strict@programming.dev 1 points 4 days ago* (last edited 4 days ago) (5 children)

What is the root basis of your external-package reluctance? Please explain, cuz that's really where the juicy story lies.

As technologists, things change and advance and we have to adapt (or not) with the times. Maintaining the universe by ourselves is impossible; instead almost all of our tech choices are from what's available. And only if/when that is insufficient do we roll up our sleeves.

More and more packages are using click. So there is a good chance if you look at your requirements .lock file that it's already a transitive dependency from another dependency.

Or said another way: show me 5 popular packages that use argparse and not click, and dataclasses and not attrs.

[–] logging_strict@programming.dev 2 points 5 days ago (7 children)

why click is based on optparse and not argparse

Long help applies only to optional args. Short help applies only to subcommands.

Special mention to how to document positional args: the docs explain the intentional lack of a help kwarg for positional args.

./thing.py -h

Lists all the subcommands, with a one-line short description for each subcommand.

./thing.py subcommand --help

Lists the detailed docs of one subcommand.

My opinion, having used both argparse and click: click is simpler, cleaner, and less time consuming.
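For comparison, a minimal stdlib argparse sketch (a hypothetical thing.py) of the same short-help/long-help split described above:

```python
import argparse

parser = argparse.ArgumentParser(prog="thing.py")
sub = parser.add_subparsers(dest="command", metavar="subcommand")

fetch = sub.add_parser(
    "fetch",
    help="download a resource",  # the one-liner shown by `thing.py -h`
    description="Download a resource and store it locally.",  # shown by `thing.py fetch -h`
)
# unlike click, argparse does accept help= on positional args
fetch.add_argument("url", help="resource to download")

args = parser.parse_args(["fetch", "https://example.com"])
print(args.command, args.url)
```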

[–] logging_strict@programming.dev 1 points 2 weeks ago* (last edited 2 weeks ago)

No endless scroll of algorithmic 'content'

That's not the case. According to the docs:

itter watch [mine|all|#chan|@user] would entail infinite scrolling of content.

itter timeline [mine|all|#chan|@user] [<page>] although there is pagination, the content list is potentially infinite.

Admit it! There is no search algorithm for content filtering or finding contacts or blocked users.

Whatever the object is, there is no search algorithm to traverse it intelligently.

For example, say i'm super popular but with a tendency to ghost everyone. Like a stereotypical LINE user faced with unpopular opinions or topics. So the list of unfollowed (aka blocked) users is approaching infinity.

A truly admirable dedication to being a really horrible human being.

At the local bar, me and my mates have a drinking game where they think up random search criteria to see the kinds of categories of people who have been blocked. A weak or nonexistent search algorithm would mean we're not likely to leave the bar on our feet.

Show me chicks with green hair who post about both climate doom and vaccines being great for children. Living in the USA, Canada, or New Zealand. With a cat avatar and either giant earrings or a nose piercing.

That should be simple enough.

yep look at that! i ghosted five green haired freaks. Now drink!

[–] logging_strict@programming.dev 2 points 2 weeks ago (1 children)

what's your secret name for the project?

the ludwicks of Void Linux ftw!

i'd actually like to do something else with my lifetime besides constantly being tossed around for no apparent benefit. i'm sure there is a good excuse. There always is.

Appreciate feedback once you've had the chance to evaluate wreck.

Feel free to make an issue. Which is the best way to catch my attention.

In CHANGES.rst, there are lists for both feature requests and known issues.

[–] logging_strict@programming.dev 2 points 1 month ago* (last edited 1 month ago) (2 children)

wreck is a dependencies manager which is venv aware, BUT it is not a venv manager, nor does it put dependencies into pyproject.toml. It sticks with good ol' requirements files.

Assumes, for each package, it's normal to be working with several venvs.

Syncs the dependencies intended for the same venv.

reqs fix --venv-relpath='.venv'
reqs fix --venv-relpath='.doc/.venv'

Across many packages, i unfortunately have to resort to manually syncing dependencies.

Lessons learned

  • have very recently ceased putting build requirements into requirements files. The build requirements' transitive dependencies should be in the requirements files. wreck does not support this yet. i'm manually removing dependencies like: build, wheel, setuptools-scm (setuptools and pip already filtered out), and click.

  • syncing across packages is really time consuming. Go thru this pain, then no dependency hell.

Wrote wreck cuz all the other options were combining requirements management with everything including the kitchen sink: build backends ... venv management ... everything goes into pyproject.toml. All these ideas seem to just compound the learning curve.

Less is more especially when it comes to learning curve.

Three-argument pow() now tries calling __rpow__() if necessary. Previously it was only called in two-argument pow() and the binary power operator. (Contributed by Serhiy Storchaka in gh-130104.)

that's a nail or wart that has been sticking out since forever

read the beta release notes to find out

no spoilers from me
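A quick sketch of what that changes. Everything below already works on older Pythons; only the three-argument pow() consulting __rpow__ is new in 3.14 (the class name here is made up):

```python
class Exp:
    """Toy right-hand operand that knows how to act as an exponent."""
    def __init__(self, n):
        self.n = n

    def __rpow__(self, base):
        return base ** self.n

# binary power operator already falls back to the right operand's __rpow__
print(2 ** Exp(10))      # 2 ** 10 = 1024
# plain three-argument pow: (2 ** 10) % 1000 = 24
print(pow(2, 10, 1000))
# on 3.14+, the three-argument form can now also reach __rpow__
# when the base type doesn't handle it (not demonstrated here)
```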

another interesting thing is optimizing runtime using mypyc. This is how our dev toolchain is so quick.

mypy, flake8, isort, ... these kinda packages

Have never tried using mypyc; would appreciate anyone sharing their experience with mypyc or other Python package compilers.

4
submitted 4 months ago* (last edited 4 months ago) by logging_strict@programming.dev to c/python@programming.dev
 

Market research

This post is only about dependency management, not package management, not build backends.

You know about these:

  • uv

  • poetry

  • pipenv

You are probably not familiar with:

  • pip-compile-multi

    (toposort, pip-tools)

You are definitely unfamiliar with:

  • wreck

    (pip-tools, pip-requirements-parser)

pip-compile-multi creates lock files. It has no concept of unlock files.

wreck produces both lock and unlock files. venv aware.

Both sync dependencies across requirement files

Both act only upon requirements files, not venv(s)

Up to speed with wreck

You are familiar with .in and .txt requirements files.

.txt is split out into .lock and .unlock. The latter is for packages which are not apps.

Create .in files that are interlinked with -r and -c. No editable builds. No urls.

(If this is a deal breaker feel free to submit a PR)

pins files

pins-*.in files are for common constraints. The huge advantage here is to document why.

Without the documentation, even the devs have no idea whether or not the constraint is still required.

Each pins-*.in file is split out to tackle one issue. The beauty is the issue must be documented with enough detail to bring yourself up to speed.

Explain the origin of the issue in terms a 6 year old can understand.
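A sketch of what one such pins file might look like (package name, version, and reason are all made up):

```
# requirements/pins-somepkg.in
#
# Why: somepkg 2.x drops Python 3.9 support, which we still ship.
# Remove this pin once our minimum Python is 3.10.
somepkg<2
```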

Configuration

python -m pip install wreck

This is logging-strict's pyproject.toml


[tool.wreck]
create_pins_unlock = false

[[tool.wreck.venvs]]
venv_base_path = '.venv'
reqs = [
    'requirements/dev',
    'requirements/kit',
    'requirements/pip',
    'requirements/pip-tools',
    'requirements/prod',
    'requirements/manage',
    'requirements/mypy',
    'requirements/tox',
]

[[tool.wreck.venvs]]
venv_base_path = '.doc/.venv'
reqs = [
    'docs/requirements',
]

[project]
dynamic = [
    "optional-dependencies",
    "dependencies",
    "version",
]

[tool.setuptools.dynamic]
dependencies = { file = ["requirements/prod.unlock"] }
optional-dependencies.pip = { file = ["requirements/pip.lock"] }
optional-dependencies.pip_tools = { file = ["requirements/pip-tools.lock"] }
optional-dependencies.dev = { file = ["requirements/dev.lock"] }
optional-dependencies.manage = { file = ["requirements/manage.lock"] }
optional-dependencies.docs = { file = ["docs/requirements.lock"] }
version = {attr = "logging_strict._version.__version__"}

Look how short and simple that is.

The only thing you have to unlearn is being so timid.

More venvs. More constraints and requirements complexity.

Do it

mkdir -p .venv || :;
pyenv version > .venv/python-version
python -m venv .venv

mkdir -p .doc || :;
echo "3.10.14" > .doc/python-version
cd .doc && python -m venv .venv; cd - &>/dev/null

. .venv/bin/activate
# python -m pip install wreck
reqs fix --venv-relpath='.venv'

There will be no avoidable resolution conflicts.

Preferable to do this within tox-reqs.ini

Details

The TOML file format expects paths to be single quoted (literal strings). The paths are relative, without the last file suffix.

If pyproject.toml is not in the cwd, pass --path='path to pyproject.toml'

create_pins_unlock = false tells wreck to not produce .unlock files for pins-*.in files.

DANGER

This is not for the faint of heart. Avoid it if you can. This is for the folks who often say, Oh really? Hold my beer!

For pins that span venvs, add the file suffix .shared

e.g. pins-typing.shared.in

wreck deals with one venv at a time. Files that span venvs have to be dealt with manually and carefully.

Issues

  1. no support for editable builds

  2. no url support

  3. no hashes

  4. your eyes will tire and your brain will splatter on the wall from all the eye rolling, after sifting thru endless posts on uv and poetry and none about pip-compile-multi or wreck

  5. Some folks love having all dependencies managed within pyproject.toml. These folks are deranged and it's impossible to convince them otherwise. pyproject.toml is a config file, not a database. It should be read only.

  6. a docs link on pypi.org is a 404. Luckily there are two docs links. Should really just fix that, but it's left like that to see if anyone notices. No one did.

4
submitted 4 months ago* (last edited 4 months ago) by logging_strict@programming.dev to c/python@programming.dev
 

Finally got around to creating a gh profile page

The design is to give activity insights on:

  • what Issues/PRs working on

  • future issues/PRs

  • for fun, show off package mascots

All out of ideas. Any suggestions? How did you improve your github profile?

14
Whats in a Python tarball (programming.dev)
submitted 5 months ago* (last edited 5 months ago) by logging_strict@programming.dev to c/python@programming.dev
 

From helping other projects, i have run across a fundamental issue for which web searches have not given appropriate answers.

What should go in a tarball and what should not?

Is it only the build files, python code, and package data and nothing else?

Should it include tests/ folder?

Should it include development and configuration files?

Have seven published packages which include almost all the files and folders, including:

  • .gitignore

  • .gitattributes

  • .github folder tree

  • docs/

  • tests/

  • Makefile

  • all config files

  • all tox files

  • pre-commit config file

My thinking is that the tarball should have everything needed to maintain the package, but this belief has been challenged. That the tarball is not appropriate for that.

Thoughts?

 

PEP 735: what is its goal? Does it solve our dependency hell issue?

A deep dive, and out comes this limitation:

The mutual compatibility of Dependency Groups is not guaranteed.

-- https://peps.python.org/pep-0735/#lockfile-generation
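For context, PEP 735 dependency groups live in pyproject.toml and can include one another; the group names below are illustrative:

```toml
[dependency-groups]
test = ["pytest", "coverage"]
docs = ["sphinx"]
# nothing guarantees `test` and `docs` resolve together:
all = [{include-group = "test"}, {include-group = "docs"}]
```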

Huh?! Why not?

mutual compatibility or go pound sand!

pip install -r requirements/dev.lock
pip install -r requirements/kit.lock -r requirements/manage.lock

The above code, purposefully, does not give pip a fighting chance. If there are incompatibilities, they'll come out when trying randomized combinations.

Without a means to test for and guarantee mutual compatibility, end users will always find themselves in dependency hell.

Any combination of requirement files (or dependency groups), intended for the same venv, MUST always work!

What if this is scaled further, instead of one package, a chain of packages?!

 

In a requirements-*.in file, at the top of the file, are lines with -c and -r flags followed by a requirements-*.in file path. Uses relative paths (ignoring URLs).

Say we have docs/requirements-pip-tools.in

-r ../requirements/requirements-prod.in
-c ../requirements/requirements-pins-base.in
-c ../requirements/requirements-pins-cffi.in

...

The intent is that compiling this would produce docs/requirements-pip-tools.txt

But there is confusion as to which flag to use. It's non-obvious.

constraint

A subset of requirements features. Intended to restrict package versions. A constraint does not install the package; it only applies if something else pulls the package in!

Does not support:

  • editable mode (-e)

  • extras (e.g. coverage[toml])
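Or as a sketch (hypothetical package names): a -r line pulls a package in, a -c line only caps a package's version if something else pulls it in.

```
# passed via -r : requests gets installed
requests

# passed via -c : urllib3 is NOT installed by this line;
# but if requests pulls urllib3 in, it must satisfy the pin
urllib3<2
```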

Personal preference

  • always organize requirements files in folder(s)

  • don't prefix requirements files with requirements-; i'm only doing it here

  • DRY principle applies; split out constraints which are shared.
