[MITgcm-support] NumPy and ScientificPython for ITRDA
Daniel Enderton
enderton at MIT.EDU
Sun Nov 14 22:02:19 EST 2004
I was able to build numarray locally in my ITRDA home space just fine, but ran
into this problem with the actual Numeric Python package (which I need in order
to get to the Scientific Python package I ultimately want):
building 'lapack_lite' extension
gcc -pthread -shared build/temp.linux-i686-2.3/Src/lapack_litemodule.o
-L/usr/lib/atlas -llapack -lcblas -lf77blas -latlas -lg2c -o
build/lib.linux-i686-2.3/lapack_lite.so
/usr/bin/ld: cannot find -lcblas
collect2: ld returned 1 exit status
error: command 'gcc' failed with exit status 1
What is the "-lcblas" option for gcc? I couldn't find anything on Google about it.
Is this something goofy with ITRDA or with the install file? I sent a report off
to a Numeric Python mailing list, so hopefully I'll hear back from them soon.
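From what I can tell, -lcblas just asks the linker to pull in the CBLAS library
(libcblas.so or libcblas.a, the C interface to BLAS that ships with ATLAS), so the
error means ld can't find that library anywhere on its search path, including the
-L/usr/lib/atlas directory on the link line. A rough way to check where (or whether)
the library exists, with the paths below only guesses for ITRDA, would be something
like:

  # Look for the CBLAS library where the link line expects it
  ls /usr/lib/atlas/libcblas*
  # Or search more widely (adjust the directory as needed)
  find /usr/lib -name 'libcblas*' 2>/dev/null

  # If ATLAS turns out to live in a different directory, one could point the
  # build at it; the directory below is hypothetical:
  python setup.py build_ext --library-dirs=/usr/lib/atlas3

If libcblas simply isn't installed, it may also be possible to build Numeric against
its bundled lapack_lite sources instead of the system ATLAS/LAPACK by adjusting
setup.py, though I haven't checked the details of that.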
Daniel
Quoting Chris Hill <cnh at MIT.EDU>:
> Daniel,
>
> Sounds useful.
>
> Is it possible for you to build them in your ITRDA home space first and
> check that they work, and maybe give us a quick demo plus a web page with
> an example of how to test them and some useful pointers?
>
> Thanks,
>
> Chris
>
> > -----Original Message-----
> > From: mitgcm-support-bounces at mitgcm.org
> > [mailto:mitgcm-support-bounces at mitgcm.org] On Behalf Of
> > Daniel Enderton
> > Sent: Sunday, November 14, 2004 12:41 PM
> > To: support at mitgcm.org; admin at mitgcm.org
> > Subject: [MITgcm-support] NumPy and ScientificPython for ITRDA
> >
> >
> > Hello ACES admin folk,
> >
> > Would it be possible to have the Numerical Python (numpy) and
> > Scientific Python modules installed on itrda? These links
> > should bring you right to the RPMs.
> >
> > Numeric Python (numpy and numarray):
> > http://sourceforge.net/projects/numpy
> >
> > Scientific Python:
> > http://starship.python.net/~hinsen/ScientificPython/
> >
> > Also, as I am migrating over from the cg01 cluster in EAPS,
> > where should I be running the models? On cg01 I was using
> > the scratch disks, and when a couple of trials were complete
> > I would tar up all the data and scp it to the data drives.
> > Is there something similar to the scratch disks on itrda?
> > In short, where should I be running my simulations?
> >
> > Cheers,
> > Daniel