social.coop is one of the many independent Mastodon servers you can use to participate in the fediverse.
A Fediverse instance for people interested in cooperative and collective projects. If you are interested in joining our community, please apply at https://join.social.coop/registration-form.html.


#conda


good grief, we broke our #ai #alttext script. for a while we were running the latest #pixtral #vision #model, but then we wanted to try the #Microsoft #Phi4 #vision model, and we use #conda for our python virtual environment management.

this is a huge mistake #lol #tech #aidev #thestruggle

i need a #venv for both pixtral and phi4 in one script's runtime.

anyone have suggestions for untangling this mess?

#python #development #fail
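One workaround for using two conda envs from a single script run (a sketch, not from the post; the env paths and caption scripts below are hypothetical) is to skip activation entirely and call each env's own python binary as a subprocess:

```python
import subprocess

# Hypothetical interpreter paths; each conda env ships its own python binary,
# so calling that binary directly runs inside the env with no activation step.
PIXTRAL_PY = "/home/bot/miniconda3/envs/pixtral/bin/python"
PHI4_PY = "/home/bot/miniconda3/envs/phi4/bin/python"

def run_in_env(python_path, script, *args):
    """Run `script` under the given interpreter and return its stdout."""
    result = subprocess.run(
        [python_path, script, *args],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

# Example (hypothetical caption script):
# alt_text = run_in_env(PHI4_PY, "phi4_caption.py", "photo.jpg")
```

This keeps pixtral's and phi4's dependencies fully separated while one driver script orchestrates both.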

obviously we are running the phi4v model now, but we were testing across all the accounts when we realized we had broken the production scripts.

same login, supposedly different virtual environments.

blah. perhaps this is what #uv is meant to fix?

Not really sure, I guess I could talk to #ChatGPT about it lol

I think it's quite problematic on #Linux to start off with the regular ol' #Python that came with the system (i.e. /usr/bin/python), and then install some packages (e.g. from the #AUR if you're on #ArchLinux) that pull in Python libraries against the system interpreter. Then you start using something like #Conda or #Miniconda, and subsequent package installations or updates may be installing those libraries into the Conda environment (i.e. ~/miniconda3/bin/python) and not the system one, so there's some overlap between the two. I'm wondering what's the best way of moving forward from this point, especially since, some time ago, it stopped being possible to raw pip install <package> (likely the distro's externally-managed-environment guard from PEP 668).
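A generic way to see which interpreter and which install a shell is actually resolving (not specific to this poster's machine, but it disambiguates exactly this system-vs-conda overlap):

```shell
which python3                                # first match on PATH (conda usually prepends its own bin/)
python3 -c "import sys; print(sys.prefix)"   # root of the active install: system, venv, or conda env
python3 -m site                              # sys.path entries plus the user site-packages location
```

If `sys.prefix` points into ~/miniconda3, pip and package updates are landing in conda's world, not the system's.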

I know we like to act like space is completely free these days, but maybe we take that too far.

"Hm, I think I should -v this one tar invocation, just to see what I'm actually backing up from my homedir."

"Sure! Here's your .python directory, containing thousands of files supporting python libraries. And your .local/python directory, containing thousands of files supporting python libraries. Oh, and your .conda directory, never guess what's in there..."

"I get it."

"Do you want to know what's in your .pyenv director~"

"I. Get. It."

(ETA: 1.2GB at the end. Though most of that is just .pyenv; the others are much smaller.)

As part of a submission to @joss I had to explore using #conda to build #software for the first time.

I have always been hesitant to use it (I like pip for python, and much of my work is on #HPC, where environment modules are king).

I wrote a short post reflecting on what I learnt and my new opinions on conda: jackatkinson.net/post/ponderin

jackatkinson.net · Pondering Conda: Some ruminations on conda following use in a recent project

Troubleshooting Python Virtual Environment Errors on Windows 11. Common causes include PATH variable issues, environment creation inconsistencies, and permission problems. Learn how to resolve these errors and improve your workflow using advanced techniques & tools like virtualenvwrapper or conda. #PythonVirtualEnvironmentError #Windows11 #Virtualenv #Conda #PythonError #Programming
tech-champion.com/microsoft-wi
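For reference, the basic venv round trip that this kind of troubleshooting assumes looks like this (shown for a POSIX shell; on Windows the activation script is .venv\Scripts\activate instead):

```shell
python3 -m venv .venv                        # create the environment
. .venv/bin/activate                         # put its bin/ first on PATH
python -c "import sys; print(sys.prefix)"    # should print a path inside .venv
deactivate
rm -rf .venv                                 # a venv is just a directory; deleting it removes it
```

Most PATH-related venv errors come down to that activation step not having run in the current shell.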

Switch to conda-forge, or even better, start using #pixi pixi.sh/latest/basic_usage/
Anaconda Inc requested that the research institute where I work pay for a corporate license because researchers are using #anaconda in their daily work. In effect, the institute will block access to repo.anaconda.com.
I have to say I didn't see this coming 10+ years ago when I started using #conda.
#enshittification

pixi.sh · Basic usage - Pixi by prefix.dev: Taking your first steps with pixi
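For anyone following the "switch to conda-forge" advice, the usual config change is a couple of commands (a sketch of the standard approach, run against an existing conda install):

```shell
conda config --add channels conda-forge     # prefer conda-forge for new solves
conda config --set channel_priority strict  # don't silently mix in packages from defaults
conda config --remove channels defaults     # may warn if defaults was never set explicitly
```

With strict priority, new environments and updates pull only from conda-forge, so nothing touches repo.anaconda.com.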

What parts of #guix are used when it's hosted?

Survey found:

It's used on top of their existing #GNU #Linux distro by a third of users.

- 50% for package management: the same need as #homebrew, #conda, or #nix.

- 41% for #dev environments (like #pip or #docker). Super interesting! Lots of comments from users looking for #Nix flakes features.

- 28% package their own software.

- 17% for #dotfiles and home environment management. Seems like an opportunity to attract users!?

guix.gnu.org/en/blog/2025/guix

My five minutes of Googling did not return a useful answer, so I’ll ask here:

Has anyone successfully automated the process of migrating #conda environments from the Anaconda defaults repo/channel (no longer free) to conda-forge?

Like, I understand that this would require modifying base configs *and* running some script in each environment to essentially uninstall packages that came from default, and reinstall the equivalents from conda-forge.

This is still preferable to doing all that manually, and so, developers being developers, surely someone has done this?

Asking on behalf of roughly 15,000 people . . .
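A rough sketch of what that automation could look like (assumptions: conda is on PATH, env names are simple with no spaces, and a parallel "-forge" copy of each env is acceptable rather than an in-place rewrite):

```shell
# Rebuild each named env from conda-forge using its history-based spec.
for env in $(conda env list | awk '!/^#/ {print $1}'); do
  conda env export -n "$env" --from-history > "/tmp/${env}.yml"
  sed -i 's/- defaults/- conda-forge/' "/tmp/${env}.yml"   # swap the channel in the spec
  conda env create -n "${env}-forge" -f "/tmp/${env}.yml"
done
```

Using --from-history keeps only the packages that were explicitly requested, so the solver is free to satisfy them from conda-forge instead of pinning defaults' exact builds.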

After a good 2 months of programming in #python, I finally found the tool #poetry, which is quite similar to how #nodejs bundles libraries into a directory, either locally or globally in the cache directory.

I have tried the other tools, from #pyenv to #venv and/or #virtualenv. I thought they handled library dependency management, only to realize that they are more like #nvm.

I did use #conda for some time, though I preferred a python-only solution. I realize poetry won't resolve every issue and I might need to look into containerization later on, but for the time being it looks like a good solution.
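The npm-like loop that poetry gives you looks roughly like this (a sketch; assumes poetry is already installed):

```shell
poetry new demo-app                 # scaffold a project with a pyproject.toml
cd demo-app
poetry add requests                 # resolve, pin in poetry.lock, install into the project venv
poetry run python -c "import requests; print(requests.__version__)"
```

Here pyproject.toml plus poetry.lock play the same role as package.json plus package-lock.json in #nodejs.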

Well, after hours of fighting to get #python + #quarto + #conda working on an HPC server, I decided to give up and take the path of least resistance: the jupyter notebook environment provided by the IT people.

Now I'm learning that not only do #jupyter notebooks themselves suck, but the jupyter IDE is... kind of bad? Setting the correct wd is a PITA, the file explorer is terrible, horizontal scrollbars hide code, and there's no variable inspector?