Import data to a different tablespace using Data Pump

In a previous article I looked at how one could force imp to place user objects in the default tablespace of a user, in cases where the originating tablespace was not available in the destination database.

The newer Oracle Data Pump utilities, expdp and impdp, provide a remapping option that lets you migrate data to a new tablespace during import. Data Pump became the preferred way of doing imports and exports in Oracle 11g.

Data Pump reads and writes data to a directory on the database server, so the first thing we must do is define such a directory on both the source and destination databases. The directory must be accessible to the Oracle database.

SQL> CREATE DIRECTORY dpxfer1 AS '/u01/dpxfer1';

We must also give the schema user access to the Data Pump directory:

SQL> GRANT READ, WRITE ON DIRECTORY dpxfer1 TO dbuser;
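If in doubt, the directory definition and its path can be verified from the data dictionary as a privileged user (a quick sanity check of my own, not part of the original walkthrough):

SQL> SELECT directory_name, directory_path FROM dba_directories;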

Then we're set for the actual export. A number of options from the old exp utility are still accepted and will automatically be replaced by their new expdp equivalents:

expdp dbuser/password DIRECTORY=dpxfer1 DUMPFILE=dbuser.dmp LOGFILE=dbuser.log CONSISTENT=Y

Here the CONSISTENT=Y option is translated on the fly, as reported in the expdp output:

Legacy Mode Parameter: "consistent=TRUE" Location: Command Line, Replaced with: "flashback_time=TO_TIMESTAMP('2012-10-15 14:43:06', 'YYYY-MM-DD HH24:MI:SS')"

As soon as we have transferred the dump file /u01/dpxfer1/dbuser.dmp to the destination server, we're ready to run the import with the REMAP_TABLESPACE option, in this case placing any objects from the USERS tablespace on the source database into the DBUSER tablespace on the destination database. Naturally, the user and tablespace must already exist on the destination, and the Data Pump directory as well as the associated user privilege must be set up just like we did on the source database.

impdp dbuser/password DIRECTORY=dpxfer1 DUMPFILE=dbuser.dmp LOGFILE=dbuser_imp.log REMAP_TABLESPACE=users:dbuser
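To confirm that the remap took effect, a quick query as the imported schema owner should show every segment in the new tablespace (my own sanity check, assuming the dbuser schema from the example):

SQL> SELECT segment_name, tablespace_name FROM user_segments;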

 


Import data to a different tablespace using imp

In the old days, it was common to store objects from assorted users in the USERS tablespace. These days, it often seems more appropriate to store user objects in a tablespace dedicated to that schema.

So, as I was transferring data to a new Oracle server, I found myself wishing to import data into a different tablespace than on the source database.

While the newer impdp tool includes a remap option to handle such a scenario, the traditional imp tool does not. A workaround is to force all user objects into the default tablespace of that user; however, this only works if the source tablespace does not exist in the destination database.

As an example, I have a user blog_data on OLDDB with its objects (in this case, a single table with a single row, and the associated primary key index) stored in the USERS tablespace.

I’ll export the schema using something like:

exp blog_data/secret@OLDDB file=blog_data.dmp

Then, on the destination database, I need to create a user with appropriate privileges and default tablespace, and make sure the USERS tablespace does not exist. Assuming this is a new installation, I can simply set a new default permanent tablespace for the database and drop the empty USERS tablespace:

CREATE TABLESPACE users2 DATAFILE SIZE 100M;

ALTER DATABASE DEFAULT TABLESPACE users2;

DROP TABLESPACE USERS;

CREATE TABLESPACE blog_data DATAFILE SIZE 1M;

CREATE USER blog_data IDENTIFIED BY secretpassword DEFAULT TABLESPACE blog_data QUOTA UNLIMITED ON blog_data;

GRANT connect,resource TO blog_data;
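Note that the CREATE TABLESPACE statements above omit explicit datafile names, which relies on Oracle Managed Files. If DB_CREATE_FILE_DEST is not already set on the destination database, something along these lines would be needed first (the path here is just an example of mine, not from the original setup):

ALTER SYSTEM SET db_create_file_dest = '/u01/app/oracle/oradata';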

Now I can simply import the schema into my new user, forcing imp to place all data in the default tablespace:

[oracle@oratest1 impexptest]$ imp blog_data/secretpassword@NEWDB file=blog_data.dmp

Import: Release 11.2.0.3.0 - Production on Fri Oct 12 23:15:44 2012

Export file created by EXPORT:V11.02.00 via conventional path
import done in AL32UTF8 character set and AL16UTF16 NCHAR character set
. importing BLOG_DATA's objects into BLOG_DATA
. . importing table "PEOPLE" 1 rows imported
Import terminated successfully without warnings.

The import completes without warnings, and we can then see that the table is in the right place:

[oracle@oratest1 impexptest]$ sqlplus "blog_data/secretpassword"

SQL> select table_name, tablespace_name from user_tables;

TABLE_NAME                     TABLESPACE_NAME
------------------------------ ------------------------------
PEOPLE                         BLOG_DATA


Documentation for odachk is now available on My Oracle Support

The documentation for Oracle Database Appliance Configuration Audit Tool is now available on My Oracle Support note 1485630.1.

The odachk script, quite similar to the exachk tool on Exadata, verifies a large number of components in the ODA environment, including network and cluster configuration, firmware and patch versions, and a number of best practices. It was made available on the ODA in patch bundle 2.2.0.
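For reference, running it is much like running exachk on Exadata: unzip the tool on the ODA and start it interactively as root. This is a rough sketch from memory, with the file name assumed; the exact steps, prompts and options are described in the user guide attached to the note:

unzip odachk.zip
./odachk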

When I failed to find any documentation on the ODA version of the tool, I contacted Oracle, who then created the above-mentioned note with the odachk User Guide attached.


Benchmark tool for Windows network shares

After some tinkering on my filer, I once again found myself looking for a simple tool to check the performance of a network share under Windows (SMB/CIFS). The usual disk benchmark tools such as CrystalDiskMark and AS SSD Benchmark typically only work with local block devices.

This time I was in luck and found a blog entry pointing out that an older version of ATTO Disk Benchmark seemed to be just the thing for a simple analysis of network share performance: http://sburke.eu/blog/2011/04/quick-disk-benchmark-for-windows/

Of course, now I need to work out why the filer’s external storage performs rather poorly over the network, and possibly how to improve on the situation.


No more iBooks for me

I'm not sure why I'm surprised to find that the DRM-protected books I bought from Apple's iTunes Store can't easily be read in any other way than with the iBooks reader app, which is available only on iOS devices. Somehow, though, I am.

I had a look at Requiem, but had no luck removing the DRM using this tool. Besides, it's a little unclear whether stripping DRM is presently allowed under Norwegian law.

Unfortunately, most new popular ebooks seem to be saddled with digital restrictions of some sort, although at least the books from the Amazon Kindle Store are readily available on multiple platforms, including Windows, Mac OS, iOS, Android and Kindle devices. Additionally, there is a web application, the Kindle Cloud Reader, making the books available on other platforms like Linux. It looks like the web app also supports offline reading.

The Norwegian ebooks I bought from stores connected to Bokskya.no, and possibly other ebooks readable by Adobe Digital Editions, seem to decrypt nicely with Calibre and some plug-ins.

At the end of the day, I'm not buying any more DRM-protected books from Apple's iTunes Store.

Oh, and I do love my new Kindle 🙂


The SMS generation has reached Oracle HQ

[Photo: installer screen on the Oracle Database Appliance]

Oracle Database Appliance asked me to “pls wait…” during installation 🙂


DLNA-compatible desktop streams from VLC

After a good bit of tinkering, I've come up with the following setup for generating a DLNA-compatible stream of a Windows or Linux desktop using VLC.

Note that there is no audio in this configuration, as it was not a requirement for my project. Also, the screen resolution of the source computer should probably match that of the stream: 1920 x 1080.

These streams work quite well with our Samsung 5-series Smart TVs, and should work for most other DLNA or UPnP AV devices capable of playing H.264 video in 1080p, aka Full HD.

For reasons I have yet to figure out, the streams are detected as 4:3 format. I was able to easily work around this by setting the TV to force 16:9.

Typically I experience 5-10 seconds delay between the source computer and the rendering device.

On Linux the built-in screen:// video source seems to work nicely, while on Windows I found an open source DirectShow filter by Roger Pack, Screen Capture Recorder, to work somewhat better.

 

Linux:

vlc screen:// --screen-fps=25 --sout '#transcode{venc=x264{profile=low,preset=ultrafast,tune=animation,keyint=25},vcodec=h264,width=1920,height=1080,fps=25,acodec=none}:http{mux=ps,dst=:8080/}' --sout-keep

 

Windows (using screen-capture-recorder):

"C:\Program Files (x86)\VideoLAN\VLC\vlc.exe" -vvv dshow:// --sout=#transcode{venc=x264{profile=low,preset=ultrafast,tune=animation,keyint=25},vcodec=h264,width=1920,height=1080,fps=25,acodec=none}:std{access=http,mux=ps,dst=0.0.0.0:8080} --no-sout-rtp-sap --no-sout-standard-sap --ttl=1 --sout-keep :dshow-vdev=screen-capture-recorder :dshow-adev=none :dshow-caching=1000

 

You should now be able to reach the stream by pointing another instance of VLC to port 8080 of the source machine. You could even test with a second instance of VLC on the same host by opening http://localhost:8080/
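From the command line, that test looks simply like this (the port matching the dst=:8080 setting in the commands above):

vlc http://localhost:8080/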

The streams should be added to a DLNA or UPnP AV media server capable of serving live streams; I have had some luck using Serviio for this. It's not open source, but there is a free version available for Linux, Windows and OS X.


Make WordPress 3.4.2 show double hyphens

By default, WordPress converts a double hyphen (--) into a single dash (–). Not very practical if you need to show a command that uses double-dash options, such as "--all-databases" in the previous post about mysqldump over ssh.

I found that this behavior could be altered by editing wp-includes/formatting.php and removing the third element of each of the arrays on lines 73 and 74 ($static_characters and $static_replacements).
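If you would rather not patch a core file (the edit will be lost on the next WordPress upgrade), the same effect can probably be achieved from a theme's functions.php by unhooking the texturize filter for post content. This is a hedged sketch of my own, not from the original fix, and note that it disables all of wptexturize's substitutions, not just the double-hyphen one:

remove_filter( 'the_content', 'wptexturize' );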

 


Ubuntu 12.04 – Ruby on Rails Production

This looks to be a smooth guide for setting up a Ruby on Rails production environment on Ubuntu 12.04.

http://excid3.com/blog/setting-up-ubuntu-12-04-with-ruby-1-9-3-nginx-passenger-and-postgresql-or-mysql/


The simplest of MySQL backups

Running mysqldump over ssh is a simple way to copy or back up a MySQL database from a remote system. Granted, it's not suited for every scenario, but for a quick dump of the small database on my remote web server, it's just the thing.

Quick and dirty, got to love it 🙂

ssh -C webserver01 mysqldump -u root --password=mysecret --all-databases > mydumpfile.dmp
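A small variation of my own that compresses the dump on the way to disk (still relying on ssh -C for compression in transit):

ssh -C webserver01 mysqldump -u root --password=mysecret --all-databases | gzip > mydumpfile.dmp.gz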
