If you are in a non-English country, you may need to use a code page other than CP1252 to support your characters; CP1256, for example, would be the right choice in Arabic-speaking countries. However, this creates a problem when it comes to the Analysis Cubes installation.
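A quick way to see the two code pages side by side is to compare the collations of your company database and the data warehouse database. The query below is a minimal sketch; the database names (the sample company “TWO” and “DynamicsGPWarehouse”) are only examples, so replace them with your own:
-- Compare the code pages behind the collations of the source company database
-- and the Analysis Cubes data warehouse (database names are examples)
SELECT name,
       collation_name,
       COLLATIONPROPERTY(collation_name, 'CodePage') AS code_page
FROM sys.databases
WHERE name IN ('TWO', 'DynamicsGPWarehouse');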
Once you install the Analysis Cubes and work through all the wizard steps, things seem to be going smoothly. But when you proceed to run the job that initializes your cubes, you start getting errors like the ones below:
Source: Copy Data from Results to CompanyMaster Task OLE DB Destination [22]
Description: The column "INTERID" cannot be processed because more than one code page (1252 and 1256) are specified for it.
End Error
Error: 2015-03-23 10:18:58.26
Code: 0xC02020F4
Source: Copy Data from Results to CompanyMaster Task OLE DB Destination [22]
Description: The column "CMPNYNAM" cannot be processed because more than one code page (1252 and 1256) are specified for it.
End Error
Warning: 2015-03-23 10:18:58.26
Code: 0x800470C8
Source: Copy Data from Results to CompanyMaster Task OLE DB Destination [22]
Description: The external columns for component "OLE DB Destination" (22) are out of synchronization with the data source columns. The external column "CompanyID" needs to be updated. The external column "CompanyName" needs to be updated.
End Warning
Error: 2015-03-23 10:18:58.26
Code: 0xC004706B
Source: Copy Data from Results to CompanyMaster Task SSIS.Pipeline
Description: "component "OLE DB Destination" (22)" failed validation and returned validation status "VS_ISBROKEN".
End Error
Progress: 2015-03-23 10:18:58.26
Source: Copy Data from Results to CompanyMaster Task
Validating: 50% complete
End Progress
Error: 2015-03-23 10:18:58.26
Code: 0xC004700C
Source: Copy Data from Results to CompanyMaster Task SSIS.Pipeline
Description: One or more component failed validation.
End Error
Error: 2015-03-23 10:18:58.26
Code: 0xC0024107
Source: Copy Data from Results to CompanyMaster Task
Description: There were errors during task validation.
End Error
Error: 2015-03-23 10:18:58.26
Code: 0xC00220E4
Source: Run Company
Description: Error 0xC0012050 while preparing to load the package. Package failed validation from the ExecutePackage task. The package cannot run. .
End Error
Warning: 2015-03-23 10:18:58.26
Code: 0x80019002
Source: DynamicsGP_TWO_to_DynamicsGPWarehouse_Package_Master
Description: SSIS Warning Code DTS_W_MAXIMUMERRORCOUNTREACHED.
The Execution method succeeded, but the number of errors raised (6) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
End Warning
DTExec: The package execution returned DTSER_FAILURE (1).
Started: 10:18:57 AM
Finished: 10:18:58 AM
Elapsed: 0.842 seconds.
The package execution failed.
The step failed.
Obviously it all comes down to the first error: two different code pages exist, and therefore your data cannot be copied from the production database to the data warehouse. Going further with my investigation, I was able to get around this error by unifying the code page in my packages. This is done simply by setting the “AlwaysUseDefaultCodePage” flag on the OLE DB components in the cubes' SSIS packages to “True”, so that they use the default code page, and the errors will no longer appear.
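If you want to verify the change outside the SSIS designer, the flag is stored in the package XML of each OLE DB source/destination component. The exact attributes differ between SSIS versions, so treat the fragment below as an illustrative excerpt only, not text copied from an actual Analysis Cubes package:
<!-- inside the OLE DB component definition, attributes simplified -->
<property name="AlwaysUseDefaultCodePage" dataType="System.Boolean">true</property>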
Now, the challenge here is that you need to extract the packages from the MSDB store into Visual Studio, make this change, save the package, and re-import it back into MSDB. This needs to be done for the following packages:
DynamicsGP_DynamicsGPWarehouse_FiscalPeriods.dtsx
DynamicsGP_DynamicsGPWarehouse_GLBudgets.dtsx
DynamicsGP_DynamicsGPWarehouse_GLMDAGroups.dtsx
DynamicsGP_DynamicsGPWarehouse_GLMDATransactions.dtsx
DynamicsGP_DynamicsGPWarehouse_GLTransactions.dtsx
DynamicsGP_DynamicsGPWarehouse_ItemCurrentQuantity.dtsx
DynamicsGP_DynamicsGPWarehouse_ItemMaster.dtsx
DynamicsGP_DynamicsGPWarehouse_PendingPurchaseOrders.dtsx
DynamicsGP_DynamicsGPWarehouse_PendingSalesOrders.dtsx
DynamicsGP_DynamicsGPWarehouse_PurchaseOrderDetail.dtsx
DynamicsGP_DynamicsGPWarehouse_SalesDetail.dtsx
DynamicsGP_DynamicsGPWarehouse_SalesPerson.dtsx
DynamicsGP_DynamicsGPWarehouse_SalesTerritory.dtsx
DynamicsGP_DynamicsGPWarehouse_TotalExpense.dtsx
DynamicsGP_DynamicsGPWarehouse_TotalRevenue.dtsx
DynamicsGP_DynamicsGPWarehouse_AgingDetails.dtsx
DynamicsGP_DynamicsGPWarehouse_AgingPeriodAmounts.dtsx
DynamicsGP_DynamicsGPWarehouse_AgingPeriods.dtsx
DynamicsGP_DynamicsGPWarehouse_CheckbookRegister.dtsx
DynamicsGP_DynamicsGPWarehouse_CompanyMaster.dtsx
Luckily, I found a script that makes this easier: it allows you to extract all your SSIS packages from MSDB to the file system instead of pulling them out one by one. It generates dtutil commands that save the packages to a folder on your “C:\” drive, and all you need to do then is copy them into Visual Studio and start changing the flag. Here is the script:
;
WITH FOLDERS AS
(
-- Capture root node
SELECT
cast(PF.foldername AS varchar(max)) AS FolderPath
, PF.folderid
, PF.parentfolderid
, PF.foldername
FROM
msdb.dbo.sysssispackagefolders PF
WHERE
PF.parentfolderid IS NULL
-- build recursive hierarchy
UNION ALL
SELECT
cast(F.FolderPath + '\' + PF.foldername AS varchar(max)) AS FolderPath
, PF.folderid
, PF.parentfolderid
, PF.foldername
FROM
msdb.dbo.sysssispackagefolders PF
INNER JOIN
FOLDERS F
ON F.folderid = PF.parentfolderid
)
, PACKAGES AS
(
-- pull information about stored SSIS packages
SELECT
P.name AS PackageName
, P.id AS PackageId
, P.description as PackageDescription
, P.folderid
, P.packageFormat
, P.packageType
, P.vermajor
, P.verminor
, P.verbuild
, suser_sname(P.ownersid) AS ownername
FROM
msdb.dbo.sysssispackages P
)
SELECT
-- assumes default instance and localhost
-- use serverproperty('servername') and serverproperty('instancename')
-- if you need to really make this generic
'dtutil /sourceserver localhost /SQL "'+ F.FolderPath + '\' + P.PackageName + '" /copy file;"C:\Packages\' + P.PackageName +'.dtsx"' AS cmd
FROM
FOLDERS F
INNER JOIN
PACKAGES P
ON P.folderid = F.folderid
-- uncomment this if you want to filter out the
-- native Data Collector packages
-- WHERE
-- F.FolderPath <> '\Data Collector'
Once you run this on your SQL Server, the result will be a set of commands like the one below. Run these commands from CMD and you will find the extracted packages in your C:\Packages folder:
dtutil /sourceserver SERVERNAME /SQL "\DynamicsGPWarehouse\Package" /copy file;"C:\Packages\Package.dtsx"
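Create the C:\Packages folder before running the commands; the easiest approach is then to paste the whole result set into a batch file and run it in one shot. A minimal sketch (the batch file name is just an example):
md C:\Packages
REM paste the generated dtutil commands into a file such as ExportCubesPackages.bat, then run it
ExportCubesPackages.bat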
Once the commands have run and your files are on the C:\ drive, select them all, paste them into your Visual Studio project, and make your modifications. When you are done, import the updated packages back into SQL Server Integration Services one by one and rerun your job.
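Alternatively, if you would rather not re-import the packages through the Integration Services UI one by one, dtutil can push them back into MSDB as well. Here is a sketch for a single package; the server name and the \DynamicsGPWarehouse folder path are assumptions based on the export commands above, and /QUIET suppresses the overwrite confirmation prompt:
dtutil /FILE "C:\Packages\DynamicsGP_DynamicsGPWarehouse_CompanyMaster.dtsx" /DestServer SERVERNAME /COPY SQL;"\DynamicsGPWarehouse\DynamicsGP_DynamicsGPWarehouse_CompanyMaster" /QUIET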
Hope that this helps.
Regards,
--
Mohammad R. Daoud MVP - MCT
MCP, MCBMSP, MCTS, MCBMSS
+962 - 79 - 999 65 85
me@mohdaoud.com
http://www.di.jo