Dear Team,
Is there any way to download SAP ECC table content and structure manually and upload it to a HANA DB?
Regards,
Jo
Hi
In SQL we use the following expression:
SELECT * FROM Customers
WHERE City IN ('Paris','London','Berlin','Bangalore');
If we want to write the same with a "CE function", how can we write it?
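From what I can tell from the SQLScript reference, the filter string of CE_PROJECTION accepts an in predicate, so the equivalent might look like this (untested sketch; the table variable names are mine):

```sql
-- inside an SQLScript procedure body using calculation engine functions
customers = CE_COLUMN_TABLE("Customers");
result    = CE_PROJECTION(:customers,
                          ["CustomerID", "CustomerName", "City"],
                          '"City" in (''Paris'', ''London'', ''Berlin'', ''Bangalore'')');
```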
thanks
Bapin
Dear Team,
I am trying to upload data from a CSV file. I have changed the date format to "yyyy-mm-dd" and the time format to "hh:mm:ss".
While importing, I changed the data types from NVARCHAR to DATE and TIME respectively.
I am getting the issue below.
Message :
Batch from record 2 to 506 failed
java.lang.IllegalArgumentException
at java.sql.Date.valueOf(Unknown Source)
at com.sap.ndb.studio.bi.filedataupload.deploy.populate.PopulateSQLTable.populateTable(PopulateSQLTable.java:93)
at com.sap.ndb.studio.bi.filedataupload.ui.job.FileUploaderJob.uploadFlatFile(FileUploaderJob.java:199)
at com.sap.ndb.studio.bi.filedataupload.ui.job.FileUploaderJob.run(FileUploaderJob.java:61)
at org.eclipse.core.internal.jobs.Worker.run(Worker.java:53)
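For what it is worth, the java.sql.Date.valueOf frame in the stack trace means at least one row does not match the strict yyyy-mm-dd form, and one bad row fails the whole batch. A quick portable check of the date column before uploading (assuming a comma-separated file with the date in column 2; the file name is a placeholder):

```shell
# print every data row whose 2nd field is not a strict yyyy-mm-dd date
# (NR>1 skips the header line)
awk -F',' 'NR>1 && $2 !~ /^[0-9][0-9][0-9][0-9]-[0-9][0-9]-[0-9][0-9]$/ {print NR": "$0}' data.csv
```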
Regards,
Jo
Hi,
I am trying to update a HANA server from SPS 80 to 84 using hdblcmgui. Please let me know the following.
1. The master guide says to stop processes; which processes should I stop?
2. This is a BW on HANA system; is there anything specific I have to take care of before updating?
3. Are there any post-steps I have to take care of after the update?
4. Should I ask all users not to use the system until the update is over?
I am updating the server, host agent and client. How about the studio? It has to be updated from the respective client machines, correct?
Please let me know.
Thanks
Joyce
Hi Folks,
My earlier thread was rejected; maybe I put in very little info.
hdbsql HDB=> IMPORT "TESTING"."<table-name>" from '<location>' WITH REPLACE THREADS 20
* 2048: column store error: table import failed: [30151] Object not found in the import directory;object=TESTING:<table-name> SQLSTATE: HY000
hdbsql HDB=> select count(*) from "TESTING"."<table-name>"
1 row selected (overall time 139.792 msec; server time 10.954 msec)
This is the problem: SELECT COUNT(*) on the table works, but IMPORT for the same table does not, even though I am following the syntax.
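For completeness, my understanding is that the directory passed to IMPORT has to contain a matching binary export of that exact schema/table; this is roughly the pair I would expect to work together (hedged sketch; table name and paths are placeholders):

```sql
-- on the source system: write a binary export of the table
EXPORT "TESTING"."MY_TABLE" AS BINARY INTO '/tmp/export' WITH REPLACE THREADS 20;
-- on the target system: import from the same directory (copied over if needed)
IMPORT "TESTING"."MY_TABLE" AS BINARY FROM '/tmp/export' WITH REPLACE THREADS 20;
```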
Hi Team,
How do I handle quantity, unit, currency and currency key fields from a flat file in a table? Is there any way to handle them in a table or in views?
Regards,
Jo
Hello experts,
Are there any special considerations regarding a HANA scale-out system update (without standby), for example from revision 72 to revision 74?
If we just update this HANA system from the master node, are the worker nodes updated automatically, or are there any special options for the distributed case?
I could not find detailed information in the SAP HANA Update and Configuration Guide.
Thanks & Best Regards,
Tong Ning
Hi,
Has anybody come across this error before and can help, please?
I am trying to start HDB after initially trying to perform an import from another system using Sapinst (Database Refresh option). Sapinst performed the Create and Load step of the database successfully, but then it timed out on the hdbnsutil -initTopology step. I aborted the Sapinst import and have since tried performing a restore in HANA Studio and also a manual initTopology command, but HDB will not start.
Error from the nameserver_alert_<hostname>.trc file is:
NOTE: full crash dump will be written to /usr/sap/BWQ/HDB35/augdnspi51/trace/nameserver_augdnspi51.33501.crashdump.20141118-201115.036400.trc
Call stack of crashing context:
1: 0x00007f159d6c6901 in TrexThreads::Thread::getState() const+0x0 at Thread.cpp:145 (libhdbbasement.so)
2: 0x00007f159ed39d3a in NameServer::BaseThread::stop(int*)+0x16 at ThreadBase.h:27 (libhdbns.so)
3: 0x00007f159ed0627f in NameServer::TREXNameServer::prepareToShutdown()+0xe0b at TREXNameServer.cpp:2833 (libhdbns.so)
[36413]{-1}[-1/-1] 2014-11-18 20:11:15.958504 e Basis TREXNameServer.cpp(02906) : Process exited due to an error via explicit exit call with exit code 1 , no crash dump will be written
[40995]{-1}[-1/-1] 2014-11-18 20:34:41.892397 e tns_ddl TNSClient.cpp(00917) : setServiceType no topology databaseId use service default: 2
[41233]{-1}[-1/-1] 2014-11-18 20:34:53.769741 e sr_nameserver TNSClient.cpp(06756) : unknown data center 2 when sending request dr_retrieveinifilecontents
[41008]{-1}[-1/-1] 2014-11-18 20:38:33.783375 e assign TREXNameServer.cpp(01810) : assign failed with ltt exception. stopping service...
exception 1: no.3000334 (DataAccess/impl/DisasterRecoveryPrimaryImpl.cpp:1031)
Operation aborted, service in shutdown
exception throw location:
1: 0x00007fb4f86c1c91 in DataAccess::ShutdownAwareBarrier::wait()+0x50 at DisasterRecoveryPrimaryImpl.cpp:1031 (libhdbdataaccess.so)
2: 0x00007fb4f86c416d in DataAccess::SessionRegistry::SessionRegistryItem::waitUntilActive(DataAccess::ShutdownAwareBarrier*)+0x89 at DisasterRecoveryPrimaryImpl.cpp:1441 (libhdbdataaccess.so)
3: 0x00007fb4f86c439b in DataAccess::DisasterRecoveryPrimaryHandlerImpl::waitUntilSecondaryIsActive()+0xa7 at DisasterRecoveryPrimaryImpl.cpp:1053 (libhdbdataaccess.so)
4: 0x00007fb4fe0261c1 in NameServer::TREXNameServer::assign(NameServer::ServiceStartInfo&)+0x20d0 at TREXNameServer.cpp:1760 (libhdbns.so)
5: 0x00007fb4fe0566c7 in NameServer::SelfAssignThread::run(void*)+0x13 at TREXNameServer.cpp:302 (libhdbns.so)
6: 0x00007fb4fc9ea1f2 in TrexThreads::PoolThread::run()+0x850 at PoolThread.cpp:265 (libhdbbasement.so)
7: 0x00007fb4fc9ebd58 in TrexThreads::PoolThread::run(void*&)+0x14 at PoolThread.cpp:124 (libhdbbasement.so)
8: 0x00007fb4fb088ccf in Execution::Thread::staticMainImp(void**)+0x98b at Thread.cpp:475 (libhdbbasis.so)
9: 0x00007fb4fb08921d in Execution::Thread::staticMain(void*)+0x39 at Thread.cpp:545 (libhdbbasis.so)
[42520]{-1}[-1/-1] 2014-11-18 20:38:56.352439 e tns_ddl TNSClient.cpp(00917) : setServiceType no topology databaseId use service default: 2
[42759]{-1}[-1/-1] 2014-11-18 20:39:08.217642 e sr_nameserver TNSClient.cpp(06756) : unknown data center 2 when sending request dr_retrieveinifilecontents
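The dr_retrieveinifilecontents / "unknown data center" messages make me suspect a leftover system replication configuration copied in from the source system, so I was also going to check the replication state (hedged; run as <sid>adm):

```shell
# show whether the instance still thinks it is a system replication primary/secondary
hdbnsutil -sr_state
```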
Thanks,
Chaz.
I saw Jody's proposed solution in this thread: "In HANA SQLScript, can I insert records into an already existing table variable?", which seems great (and probably is), and now I am trying to do the same, but it does not work as I hoped.
PROCEDURE temp (IN a; OUT b)
...
DECLARE cond ...
...
CALL proc_demo(c); -- c is returned (OUT) from proc_demo and contains columns 'id' & 'ref'
b = SELECT id AS field1 FROM :c WHERE ref = cond
UNION ALL
SELECT * FROM :a WHERE ref = cond;
Note: 'a' is based on an ABAP dictionary table and contains no column 'id', but column 'field1' among others.
The syntax error is: "Invalid column name; mismatched column in set (UNION ALL)"
Just for testing I also tried (which gives the same error):
b = SELECT field1 AS field1 FROM :a WHERE ref = cond
UNION ALL
SELECT * FROM :b WHERE ref = cond;
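For reference, my understanding is that both branches of a UNION ALL must project the same number (and types) of columns, and SELECT * on :a expands to all of a's columns; so I expected something like this to be the well-formed shape (untested sketch):

```sql
-- project exactly one column on both sides; scalar variables are read with a colon
b = SELECT id AS field1 FROM :c WHERE ref = :cond
    UNION ALL
    SELECT field1 FROM :a WHERE ref = :cond;
```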
I would appreciate any help on this. Please note I am new to SQL, so please explain on a basic level.
Hi experts,
I have ZBW content with some calculation views. When I activated them, they went to _SYS_BIC; is that normal? Can I change it to a ZBW catalog?
Thanks in advance,
Regards,
Does someone have a URL or email address for subscribing to the RHEL for SAP HANA channel?
I would like to download the RHEL for SAP HANA ISO DVD, or to configure a yum repo, so that I can install compat-sap-c++.
Regards
I have deleted all my views & procedures from several packages, and nothing appears to be hidden, but when I try to delete the package I get the below error message:
Delete Error
Repository:Package is not empty (contains development objects); workspaces:
I have only worked from the modeler perspective, not the developer perspective.
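In case it helps diagnose, this is how I was checking what the repository still records under the package (hedged sketch; the package name is a placeholder — inactive objects in a workspace would also block deletion):

```sql
-- objects the repository still holds for the package, active and inactive
SELECT package_id, object_name, object_suffix
  FROM "_SYS_REPO"."ACTIVE_OBJECT"
 WHERE package_id = 'my.pkg'
UNION ALL
SELECT package_id, object_name, object_suffix
  FROM "_SYS_REPO"."INACTIVE_OBJECT"
 WHERE package_id = 'my.pkg';
```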
Your help is greatly appreciated
I looked into the definition of these two HANA views. They are a bit complex but look similar, and from a 'documentation' point of view they are equal (= information on row and column tables).
Interestingly, there is a difference in the results I get when I query them. So, in order to find the differences, I ran the following two queries, with surprising results:
select schema_name, table_name from tables
minus
select schema_name, table_name from m_tables;
This should give me tables from system view TABLES which are NOT in system view M_TABLES, and there are some.
Vice versa, running the following query:
select schema_name, table_name from m_tables
minus
select schema_name, table_name from tables;
gives a different result set.
Why??
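To dig into the difference, a query along these lines (sketch) shows which table types account for the rows that are in TABLES but not in M_TABLES:

```sql
-- rows present in the catalog view TABLES but absent from the monitoring view M_TABLES
SELECT t.schema_name, t.table_name, t.table_type
FROM tables t
LEFT JOIN m_tables m
  ON  t.schema_name = m.schema_name
  AND t.table_name  = m.table_name
WHERE m.table_name IS NULL;
```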
Hi Team,
I have enabled SSL using OpenSSL, and I can log in to the studio over SSL.
With hdbsql, however, I am unable to connect:
hdbsql -U monitoring -sslprovider openssl -sslkeystore $HOME/.ssl/key.pem -ssltruststore $HOME/.ssl/trust.pem -a -I "sample.sql"
4321: only secure connections are allowed: SQLSTATE: HY000
hdbsql -i 01 -u SYSTEM -p ************ -sslprovider openssl -sslkeystore $HOME/.ssl/key.pem -ssltruststore $HOME/.ssl/trust.pem -a -I "sample.sql"
With either user I am unable to connect via SSL.
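If I read the hdbsql help correctly, the -ssl* options only configure the crypto provider and stores; the switch that actually turns on an encrypted connection is -e, so my next attempt would be (hedged; password is a placeholder):

```shell
# -e enables TLS for the connection; the -ssl* options then tell hdbsql
# which provider and key/trust stores to use
hdbsql -i 01 -u SYSTEM -p '<password>' -e -sslprovider openssl \
  -sslkeystore $HOME/.ssl/key.pem -ssltruststore $HOME/.ssl/trust.pem \
  -a -I sample.sql
```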
Hi All, I was just wondering if we can start an individual HANA service (e.g. hdbindexserver) via the command line, without using the studio.
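As far as I know the hdbdaemon supervises the individual services and restarts one that stops; from the command line I have only found instance-level control plus process listing (hedged sketch; instance number 00 is a placeholder, run as <sid>adm):

```shell
# list the HANA services and their states for instance 00
sapcontrol -nr 00 -function GetProcessList
# stop/start the whole instance (not a single service)
HDB stop
HDB start
```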
Thank you
Jonu Joy
Hello,
I tried to search the HANA Live content for Asset Accounting; however, it seems it is not available.
Can someone please tell me whether Asset Accounting HANA Live content is available, or help me understand why it is not available yet?
Regards,
Swati
Hello all,
I have some confusion and need your expertise around embedded BW and cross system reporting.
First let me tell you my situation.
We have CRM on HANA, ECC on HANA and SCM on HANA. My client is not interested in SAP BW but would like to do cross system reporting.
A couple of options I have presented are the two-schemas-on-one-appliance scenario with SAP HANA Live, and using Smart Data Access to make cross-system reporting possible.
However, I also want to explore the embedded BW option. Let's say I want to report across SCM and ECC data, which are powered by two separate HANA instances.
1) Can I use the embedded BW in SCM and, using DXC, create an IMDSO in my SAP ECC on HANA system? How will the deltas be managed, given that the BW extractors come with certain delta modes?
2) I am assuming that this IMDSO is not restricted to SAP BW on HANA?
3) Will the data be stored in memory on a constant basis in the IMDSO in my ECC on HANA system?
4) Can I load embedded BW extractor content & tables from the SAP SCM on HANA system using BODS to do ETL (applying logic/transformation/enrichment) and then load into SAP ECC on HANA? And can I land this data in a table in ECC?
Can BODS handle deltas in this case like it does for extractors between ECC and BW on HANA?
I am trying multiple solutions for cross-system reporting without a data warehouse.
Your help is greatly appreciated.
Regards,
Zain
Hi Team,
Could someone guide me on how to set up the session timeout in HANA?
I know it can be changed like this:
ALTER SYSTEM ALTER CONFIGURATION ('xsengine.ini', 'System') SET('httpserver', 'sessiontimeout') ='7200' WITH RECONFIGURE;
But there is a session timeout in SAP Webdispatcher as well:
icm/server_port_0 = PROT=HTTP,PORT=$(_HTTP_PORT),PROCTIMEOUT=600
icm/server_port_1 = PROT=HTTPS,PORT=$(_HTTPS_PORT),PROCTIMEOUT=600
In a nutshell, how do I modify the session timeout settings for XS apps?
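As a sanity check either way, the effective value can be read back from M_INIFILE_CONTENTS (sketch):

```sql
-- show the currently configured XS engine session timeout, per configuration layer
SELECT layer_name, value
FROM m_inifile_contents
WHERE file_name = 'xsengine.ini'
  AND section   = 'httpserver'
  AND key       = 'sessiontimeout';
```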
Thanks,
Razal