Posts: 5 · Comments: 164 · Joined: 1 yr. ago
  • Social engineering malware

    Will save the world by foreshadowing the pypi.org delisting announcement

    Great example of malware that uses social engineering to deceive; only later do you realize those good intentions were quite harmful.

    One step away from requiring a PoW algo to fund package authors. Which is great unless you're running it on a crap laptop from the stone ages. Then it becomes a barrier to entry or simply broken UX.

    Don't like to hear this, but at some point in time this package will be delisted from pypi.org

  • Hello! I have an inferiority complex. Would like to give everyone the impression that I'm a very important person.

    For my packages, how do I make import and install duration correlate with my deep inferiority complex? To give the impression of compiling a massive code base written in a low level language. Rather than this duck typed language, static type checked with pre-py312 knowhow (which is the truth!).

    American Python packages should run like American motorcycles, bleeding oil all over the road.

    Let's Make America clunky af again

    This may or may not be sarcasm

    It's really really dangerous to impose a parody on a package author who's written both a build backend and a requirements parser. If Trump found out about this, the build backend might incorporate tariffs.

    This is one plugin away from becoming a feature

    Heil cobra!!

  • I use interrogate

    Which has a fun bug. It uses ast to compile every module. If a module contains a compile error, you get a traceback showing only the one line that contains the issue, but neither the module path nor the line number.

    The only way to find the issue is to just test your code base and ignore interrogate until the code base is more mature.
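
    A minimal sketch (not interrogate's actual code, and the module path is made up) showing that ast.parse() already raises a SyntaxError carrying the file path, line number, and offending line, so a friendlier report is one except clause away:

        import ast
        from pathlib import Path

        def check_module(path: Path) -> None:
            """Parse one module; report syntax errors with path and line number."""
            source = path.read_text(encoding="utf-8")
            try:
                ast.parse(source, filename=str(path))
            except SyntaxError as exc:
                # exc.filename, exc.lineno, and exc.text are all populated here
                print(f"{exc.filename}:{exc.lineno}: {exc.msg}")
                print(f"    {exc.text}")

        check_module(Path("src/example_pkg/broken.py"))  # hypothetical path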

    interrogate author and maintainers: UX ... what's that?

    The most obvious bug in the history of bugs ... setuptools maintainers, oh that's a feature request

  • No normal sqa user will have any clue how to do this.

    Might wanna take this up with the SQLAlchemy maintainers OR look at the SQLAlchemy tests for how they are testing driver integration.

    Some other dialect+driver will do the exact same thing as this JDBC driver.

  •     informix+ifx_jdbc://hostname:port/db_name;INFORMIXSERVER=server_name;delimident=y;user=user;password=pass

    becomes

        informix+ifx_jdbc://user:pass@hostname:port/db_name?INFORMIXSERVER=server_name&delimident=y

    If this is a SQLAlchemy engine connection URL, then it would look like the fixed one.
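
    A rough sketch of building that URL programmatically with SQLAlchemy's URL.create, assuming a dialect were actually registered under the name informix+ifx_jdbc (it is not official) and using a placeholder port number:

        from sqlalchemy.engine import URL

        # URL.create handles quoting/escaping of each component.
        url = URL.create(
            drivername="informix+ifx_jdbc",  # hypothetical dialect name
            username="user",
            password="pass",
            host="hostname",
            port=9088,  # placeholder; the real port goes here
            database="db_name",
            query={"INFORMIXSERVER": "server_name", "delimident": "y"},
        )
        print(url.render_as_string(hide_password=False))
        # informix+ifx_jdbc://user:pass@hostname:9088/db_name?INFORMIXSERVER=server_name&delimident=y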

  • https://github.com/OpenInformix/IfxPy/blob/master/README.md

    This does not appear to support SQLAlchemy. It has a DBAPI2 interface but no official driver+dialect supported by SQLAlchemy.

    In which case, why are you bothering? Time would be better spent working on adding SQLAlchemy support than using the DBAPI2 directly. Otherwise the code would be completely non-portable when you decide to switch to another driver+dialect.

    Regular dependencies lack the host URL and hashes. Those are the most important parts.

    For the full details, I encourage you to read pep751.

    ^^ Look, a link! Oh so clickable and tempting. Go ahead. You know that pretty blue font-color is just asking for it. And after clicking, the font-color changes. Wonder what font-color it'll become? Hmmmm
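
    To make the hash part concrete, a tiny sketch (artifact name and digest are made up) of checking a downloaded wheel against a pinned sha256, which is roughly what pip's --require-hashes mode automates:

        import hashlib
        from pathlib import Path

        # Hypothetical artifact and pinned digest, for illustration only.
        artifact = Path("dist/example_pkg-1.0.0-py3-none-any.whl")
        pinned_sha256 = "0123456789abcdef" * 4  # placeholder 64-hex-char digest

        digest = hashlib.sha256(artifact.read_bytes()).hexdigest()
        if digest != pinned_sha256:
            raise SystemExit(f"hash mismatch for {artifact.name}: got {digest}")
        print(f"{artifact.name} matches the pinned hash")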

  • I love requirements files, venv, and pyenv.

    Bringing requirements into pyproject.toml does not have enough value add to bother with. My requirements files are hierarchical, extensively using the -r and -c options, AND they are venv aware.

    pep751 does bring value, by stating both the host URL and the hash of every package.

    setuptools will fight this to continue their stranglehold on Python.

  • I'm sad to report

    wreck 0.3.4.post0 no longer emits build front end options into .lock files wreck#30.

    Background: efforts to beg and plead with the setuptools maintainers to bend ever so slightly.

    Continuing from the denied request to pass build front end options through requirement files so that non-pypi.org hosts are known (setuptools#4928).

    This hurts those hosting packages locally or remotely on non-pypi.org package index servers. For those who do, the packages themselves give no clue where the dependencies and transitive packages are hosted.

    Each and every user would need to have a ~/.pip/pip.conf or pass the --extra-index-url pip install CLI option. And somehow know all the possible package index servers.

    This allows the pypi.org cartel to continue along its merry way unimpeded.

    I wish pep751 good luck and may there be a .unlock equivalent. I do not yet understand how the pep751 implementers will bypass setuptools and build.

  • Viva la package dependencies!

    Does it do away with setuptools? After my experience interacting with the maintainers, I now refer to that package as The Deep State.

    The Deep State only supports loading dependencies from pypi.org. Which has many advantages, right up until it doesn't.

    This new standard contains the dependency host URL. I hope there is a package other than setuptools that supports it.

    When I bring it up, and prove it, the responses alternate between playing dumb and gaslighting. The truth is The Deep State are gatekeepers. And they are in the way.

    Training wheels off mode please! So there is support for requirements files that state which server dependencies are hosted on, with more than one choice. I would like the option to host packages locally or remotely using pypiserver or equivalent.

    On the positive side, the setuptools maintainers did not suggest voodoo dolls, waiting out the planetary alignment, better economic conditions, or peace on Earth.

    That's how the conversation comes off to my eyes. But form your own opinion. Especially enjoyable for folks who also enjoyed the TV series, The Office.

    What are the alternatives to being stonewalled by setuptools?

    Disclosure: I wrote the requirements rendering package, wreck. I have my own voodoo dolls and plenty of pins.

  • Learn Sphinx, which can mix .rst and .md files. myst-parser is the package that deals with .md files.

    Just up your game a bit and you'll have variables similar to Obsidian tags, which don't cause problems when rendered into an HTML web site or a PDF file.
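
    For example, a minimal conf.py sketch (the substitution names are made up) enabling myst-parser's substitution extension, which gives you document-wide variables usable as {{ project_name }} inside .md files:

        # conf.py -- minimal Sphinx configuration sketch
        extensions = ["myst_parser"]

        # Enable the {{ key }} substitution syntax in MyST markdown.
        myst_enable_extensions = ["substitution"]

        # Document-wide variables, roughly comparable to Obsidian-style variables.
        myst_substitutions = {
            "project_name": "example-pkg",
            "min_python": "3.9",
        }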

  • The way forward is to make a unittest module (unfortunately the author is not using pytest), with example characters taken from each of the four forms.

    THEN go to town testing each of the low level functions.

    I suspect the test coverage is awful. mypy and flake8 results, also awful.

  • There are several forms

    • K1 NoSymbol K2 NoSymbol: characters with lower/upper case forms
    • K1 K2 K1 K2: Unicode <= 256 with no lower/upper case forms, like the | or + symbol
    • K1 K2 K3 NoSymbol: 2 bytes, Latin Extended character set
    • K1 K2 K3 K4: 3 bytes, like the nuke radiation emoji

    Non-authoritative guess, from having played around with xev together with the onboard virtual keyboard and my symbols layout.
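
    A sketch of the unittest-module idea from the comment above, with one illustrative group per form (the keysym values and the normalize_group helper are made-up stand-ins, not the real package's API):

        import unittest

        NO_SYMBOL = 0  # stand-in for X11 NoSymbol

        # One example keysym group per form, as (K1, K2, K3, K4) tuples.
        FORM_CASED = (ord("a"), NO_SYMBOL, ord("a"), NO_SYMBOL)
        FORM_UNCASED = (ord("+"), ord("+"), ord("+"), ord("+"))
        FORM_LATIN_EXT = (0x100, 0x101, 0x100, NO_SYMBOL)
        FORM_FOUR = (0x1F0, 0x1F1, 0x1F2, 0x1F3)  # values illustrative only

        def normalize_group(group):
            """Stand-in: a cased group becomes (lowercase K, uppercase K)."""
            k = group[0]
            if group[1] == NO_SYMBOL and chr(k).isalpha():
                return (ord(chr(k).lower()), ord(chr(k).upper()))
            return group

        class KeysymGroupTest(unittest.TestCase):
            def test_all_forms_are_four_element_groups(self):
                for group in (FORM_CASED, FORM_UNCASED, FORM_LATIN_EXT, FORM_FOUR):
                    self.assertEqual(len(group), 4)

            def test_cased_group_normalizes_to_lower_upper(self):
                self.assertEqual(normalize_group(FORM_CASED), (ord("a"), ord("A")))

            def test_uncased_group_is_unchanged(self):
                self.assertEqual(normalize_group(FORM_UNCASED), FORM_UNCASED)

        if __name__ == "__main__":
            unittest.main()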

  • keysym indexes 0 and 2 are for lower and upper case, if the character has upper and lower case equivalents.

    This is documented in keysym_group when it should be documented in keysym_normalize

        In that case, the group should be treated as if the first element were
        the lowercase form of ``K`` and the second element were the uppercase
        form of ``K``.
  • Python @programming.dev
    logging_strict @programming.dev

    Dependency management

    Market research

    This post is only about dependency management, not package management, not build backends.

    You know about these:

    • uv
    • poetry
    • pipenv

    You are probably not familiar with:

    • pip-compile-multi (toposort, pip-tools)

    You are definitely unfamiliar with:

    • wreck (pip-tools, pip-requirements-parser)

    pip-compile-multi creates lock files. It has no concept of unlock files.

    wreck produces both lock and unlock files. It is venv aware.

    Both sync dependencies across requirement files

    Both act only upon requirements files, not venv(s)

    Up to speed with wreck

    You are familiar with .in and .txt requirements files.

    .txt is split out into .lock and .unlock. The latter is for packages which are not apps.

    Create .in files that are interlinked with -r and -c. No editable builds. No URLs.

    (If this is a deal breaker feel free to submit a PR)

    pins files

    pins-*.in files are for common constraints. The huge advantage here is being able to document why.

    Python @programming.dev
    logging_strict @programming.dev

    Feedback on gh profile design

    Finally got around to creating a gh profile page

    The design is to give activity insights on:

    • what Issues/PRs I'm working on
    • future issues/PRs
    • for fun, show off package mascots

    All out of ideas. Any suggestions? How did you improve your github profile?

    Python @programming.dev
    logging_strict @programming.dev

    What's in a Python tarball

    From helping other projects, I have run across a fundamental issue for which web searches have not given appropriate answers.

    What should go in a tarball and what should not?

    Is it only the build files, python code, and package data and nothing else?

    Should it include tests/ folder?

    Should it include development and configuration files?

    I have seven published packages which include almost all the files and folders, including:

    • .gitignore
    • .gitattributes
    • .github folder tree
    • docs/
    • tests/
    • Makefile
    • all config files
    • all tox files
    • pre-commit config file

    My thinking is that the tarball should have everything needed to maintain the package, but this belief has been challenged: the tarball is not appropriate for that.
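
    Whichever way you lean, a quick sketch (the sdist file name is hypothetical) for auditing what actually lands inside a tarball produced by python -m build --sdist:

        import tarfile

        SDIST = "dist/example_pkg-1.0.0.tar.gz"  # hypothetical sdist

        with tarfile.open(SDIST, mode="r:gz") as tar:
            members = sorted(member.name for member in tar.getmembers())

        # Flag the folders whose inclusion is usually debated.
        debated = ("tests/", "docs/", ".github/")
        for name in members:
            marker = "  <-- debated" if any(part in name for part in debated) else ""
            print(f"{name}{marker}")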

    Thoughts?

    Python @programming.dev
    logging_strict @programming.dev

    PEP 735: do dependency groups solve anything?

    PEP 735: what is its goal? Does it solve our dependency hell issue?

    A deep dive, and out comes this limitation:

    The mutual compatibility of Dependency Groups is not guaranteed.

    -- https://peps.python.org/pep-0735/#lockfile-generation

    Huh?! Why not?

    mutual compatibility or go pound sand!

        pip install -r requirements/dev.lock
        pip install -r requirements/kit.lock -r requirements/manage.lock

    The above commands, purposefully, do not afford pip a fighting chance. If there are incompatibilities, they'll come out when trying randomized combinations.

    Without a means to test for and guarantee mutual compatibility, end users will always find themselves in dependency hell.

    Any combination of requirement files (or dependency groups), intended for the same venv, MUST always work!
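
    A rough sketch of that brute-force check (directory layout and file names are assumptions, POSIX paths): create a throwaway venv and let pip's resolver try every combination of lock files, surfacing any mutual incompatibility:

        import itertools
        import subprocess
        import venv
        from pathlib import Path

        REQ_DIR = Path("requirements")          # assumed layout
        LOCKS = sorted(REQ_DIR.glob("*.lock"))  # e.g. dev.lock, kit.lock, manage.lock

        def try_combo(combo):
            """Install a combination of lock files into a fresh venv; True if pip succeeds."""
            env_dir = Path(".combo-check-venv")
            venv.create(env_dir, with_pip=True, clear=True)
            args = [str(env_dir / "bin" / "pip"), "install", "--quiet"]
            for lock in combo:
                args += ["-r", str(lock)]
            return subprocess.run(args).returncode == 0

        for r in range(2, len(LOCKS) + 1):
            for combo in itertools.combinations(LOCKS, r):
                ok = try_combo(combo)
                print("OK  " if ok else "FAIL", *[lock.name for lock in combo])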

    What if this is scaled further: instead of one package, a chain of packages?!

    Python @programming.dev
    logging_strict @programming.dev

    constraint vs requirement. What's the difference?

    In a requirements-*.in file, at the top of the file, are lines with -c and -r flags followed by another requirements-*.in file. These use relative paths (ignoring URLs).

    Say we have docs/requirements-pip-tools.in

        -r ../requirements/requirements-prod.in
        -c ../requirements/requirements-pins-base.in
        -c ../requirements/requirements-pins-cffi.in

        ...

    The intent is that compiling this would produce docs/requirements-pip-tools.txt

    But there is confusion as to which flag to use. It's non-obvious.

    constraint

    A subset of requirements features. Intended to restrict package versions. It does not necessarily (might not) install the package!

    Does not support:

    • editable mode (-e)
    • extras (e.g. coverage[toml])

    Personal preference

    • always organize requirements files in folder(s)
    • don't prefix requirements files with requirements-; I'm just doing it here
    • DRY principle applies; split out constraints which are shared.