Tuesday, November 29, 2016

Office 365 Account Creation and Bulk Emailing of User Details with PowerShell

If you have purchased Office 365 for your organization, your first task, after configuring the related DNS records on your DNS server, is to create the user accounts in Office 365.

To create these accounts in bulk you can use the import pane of the Office 365 admin panel, but you can also handle this tedious work with PowerShell, without ever opening the admin panel.

The following PowerShell script will help you to:

  • Create accounts
  • Assign the Office 365 license pack you have
  • Export the created account details, including the assigned temporary passwords, to a NewAccountResults.csv file


Beforehand you should have your staff details in a users.csv file.
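A minimal users.csv might look like this; the column names match what the script below reads, and the values are hypothetical:

UserPrincipalName,DisplayName,UsageLocation
jdoe@yourdomain.edu,John Doe,TR
asmith@yourdomain.edu,Alice Smith,TR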



#A pop-up will appear and wait for you to enter administrator account credentials
$credential = Get-Credential

#Import the installed Azure Active Directory (MSOnline) module
Import-Module MSOnline

#Establish an Online Services connection to the Office 365 management layer
Connect-MsolService -Credential $credential

#Create the users listed in users.csv
Import-Csv .\users.csv | ForEach-Object {
    #Generate a random 9-character password
    $office_365_password = ([char[]]([char]33..[char]95) + ([char[]]([char]97..[char]126)) + 0..9 | Sort-Object {Get-Random})[0..8] -join ''
    #Create the user
    New-MsolUser -UserPrincipalName $_.UserPrincipalName -DisplayName $_.DisplayName -Password $office_365_password -UsageLocation $_.UsageLocation
} | Export-Csv -Path ".\NewAccountResults.csv" -NoTypeInformation

#Assign your license pack to all unlicensed users
Get-MsolUser -All -UnlicensedUsersOnly | Set-MsolUserLicense -AddLicenses "yourdomain365:OFFICESUBSCRIPTION_PACKNAME"
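If you are not sure of the exact AccountSkuId string to pass to -AddLicenses, you can list the license packs available in your tenant first with the same MSOnline module:

#List the license SKUs (AccountSkuId values) available in your tenant
Get-MsolAccountSku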

Then you should send the temporary passwords to the personal email accounts of your staff with the following PowerShell script.


# From: sender email account
$EmailFrom = "Name Surname <sender@domain.com>"

# Reporting: report on success and failure (optional)
$EmailDeliveryNotificationOption = "onSuccess, onFailure"

# Server: your email SMTP server
$EmailSMTPserver = "mail.example.com"

# Users: csv file path; the file includes Name, PersonelEmail, Office365Account and Password columns
$SourcePath = ".\mail_merge_powershell.csv"

# Import the csv file
$Users = Import-Csv -Path $SourcePath
 
# ####################
# END Variables
# ####################
 
# Begin loop: do the following with each row of the imported file, referencing columns by their headers
foreach ($User in $Users) {

    # To: user's email address
    $EmailTo = $User.PersonelEmail

    # Subject: email subject (variables may be merged in)
    $EmailSubject = "About Your Office365 Account " + $User.Name + "."

    # Body: email body, with HTML formatting
    $EmailBody = "<!DOCTYPE html PUBLIC ""-//W3C//DTD XHTML 1.0 Transitional//EN"" ""http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"">"
    $EmailBody += "<html xmlns=""http://www.w3.org/1999/xhtml""><head>"
    $EmailBody += "<meta http-equiv=""Content-Type"" content=""text/html; charset=UTF-8"" />"
    $EmailBody += "<meta name=""viewport"" content=""width=device-width, initial-scale=1.0""/>"
    $EmailBody += "<title>" + $EmailSubject + "</title>"
    $EmailBody += "</head><body bgcolor=""#FFFFFF"" style=""font-family: sans-serif; color: #000000"">"
    $EmailBody += "<p>Dear " + $User.Name + ",</p>"
    $EmailBody += "<p>Our organization has purchased an Office365 subscription for 1 year.</p><ul>"
    $EmailBody += "<li>Your Office365 account: <strong>" + $User.Office365Account + "</strong></li>"
    $EmailBody += "<li>Your Office365 account password: <strong>" + $User.Password + "</strong></li>"
    $EmailBody += "<li>You can log in to your Office365 account using <a href='https://login.microsoftonline.com/'>this link</a>.</li></ul>"
    $EmailBody += "</body></html>"

    # Optional: preview the generated body before sending
    # Write-Output $EmailBody

    # Merge: send the email (add -WhatIf to the line below to test without actually sending)
    Send-MailMessage -To $EmailTo -From $EmailFrom -Subject $EmailSubject -Body $EmailBody -BodyAsHTML -SmtpServer $EmailSMTPserver -DeliveryNotificationOption $EmailDeliveryNotificationOption
}
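For reference, a minimal mail_merge_powershell.csv with hypothetical values (you can assemble it from NewAccountResults.csv and your staff contact list) might look like this:

Name,PersonelEmail,Office365Account,Password
John Doe,john.doe@gmail.com,jdoe@yourdomain.edu,X7!kp2#qr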

Friday, October 28, 2016

Dspace and Piwik Integration

The Piwik definition from the piwik.org FAQ is as follows:
Piwik is a downloadable, Free (GPL licensed) web analytics software platform. It provides detailed reports on your website and its visitors, including the search engines and keywords they used, the language they speak, which pages they like, the files they download and so much more. Piwik aims to be an open source alternative to Google Analytics. Piwik is PHP MySQL software which you download and install on your own webserver. At the end of the 5-minute installation process you will be given a JavaScript tag. Simply copy and paste this tag on websites you wish to track (or use an existing plugin to do it automatically for you).
After installing and configuring Piwik you can track visitor analytics for your institutional repository. If you want to embed these analytics into DSpace you can use Piwik's widgets. There are three places where you can embed them:

1-) By adding the iframe widget code to the Community/Collection introductory and news HTML sections
2-) By adding the iframe widget code to page-structure.xsl
3-) By adding the iframe widget code to item-view.xsl

The first method is the easiest. Go to the Widgets menu in the Piwik admin panel and copy the iframe embed code.


Then go to the collection/community to which you want to add the widget and edit it, pasting the embed code into the introductory or news HTML section.

Then update the collection/community, and that's it: you will see the analytics on the community/collection page. One point is still missing, though: the widget will show the analytics of your whole DSpace repository. You should segment the analytics so that they reflect only the collection/community you want. To do that, use the pageUrl segment parameter with the URL of the community/collection. Let's assume your collection URL is
                  
http://dspace.university.edu/handle/11111/23 

Then you should add the following parameter to the widget URL (note that the collection URL is URL-encoded twice):

segment=pageUrl%3D%3Dhttp%253A%252F%252Fdspace.university.edu%252Fhandle%252F11111%252F23

Append it after the date=yesterday parameter of the embed code, and the iframe code becomes something like the example below.
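A sketch of the resulting embed, assuming a Piwik instance at piwik.university.edu, site id 1 and the VisitsSummary widget; adjust all three to your own setup:

<iframe src="https://piwik.university.edu/index.php?module=Widgetize&action=iframe&moduleToWidgetize=VisitsSummary&actionToWidgetize=index&idSite=1&period=week&date=yesterday&segment=pageUrl%3D%3Dhttp%253A%252F%252Fdspace.university.edu%252Fhandle%252F11111%252F23&disableLink=1&widget=1" frameborder="0" marginheight="0" marginwidth="0" width="100%" height="350"></iframe>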



The second and third methods are nearly the same; only the places differ. If you want to add analytics to all pages, use page-structure.xsl; if you want to add analytics only to item pages, use item-view.xsl. Now let's move on to the method. Open page-structure.xsl or item-view.xsl with your favorite editor (mine is nano) and add the following template code to the xsl file. Let's add a real-time map to the header part of all DSpace pages.


nano /dspace/webapps/xmlui/themes/Mirage2/xsl/core/page-structure.xsl

We add the template call right after the lines in page-structure.xsl that contain the offcanvas and row-offcanvas-right classes:

<xsl:call-template name="realtimemap"/>


Then, at the bottom of the file, we define the template itself.
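A sketch of the template definition, assuming Piwik's real-time visitor map widget (UserCountryMap/realtimeMap) and site id 1; both are placeholders you should adapt. Note the &amp; entities and the {$piwik-uri} variable:

<xsl:template name="realtimemap">
    <iframe src="https://{$piwik-uri}/index.php?module=Widgetize&amp;action=iframe&amp;moduleToWidgetize=UserCountryMap&amp;actionToWidgetize=realtimeMap&amp;idSite=1&amp;period=day&amp;date=today&amp;disableLink=1&amp;widget=1" frameborder="0" width="100%" height="350"></iframe>
</xsl:template>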


 
Attention: you should replace all & characters with &amp; in the iframe embed code when pasting it into xsl files!
You may have noticed the {$piwik-uri} variable in the segmentation part of the embed code. This is the URL variable that you should define in global-variables.xsl, as below.
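A one-line sketch of the definition; the hostname is a placeholder for your own Piwik instance:

<xsl:variable name="piwik-uri">piwik.university.edu</xsl:variable>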




Then you will see the real-time map on all pages of your DSpace.








Wednesday, September 21, 2016

Two-Factor Authentication for Linux

If, like me, you have turned your security paranoia up a notch, you have probably long since started using a Linux distribution on your work and home computers. If, at user logins and when running commands with sudo, you want to use a one-time password (OTP) followed by your regular password instead of just a password, that is, if you want two-factor authentication, the Ubuntu repositories contain a PAM module that works together with Google Authenticator. After activating it and setting up the mobile application, it is enough to enter the OTP security code generated on your phone and then your password when working with sudo. Now, here is how to do all of this:

1-) First we install the PAM module on our computer. We also install the Google Authenticator application on our phone from the app store.

sudo apt-get install libpam-google-authenticator

2-) Next we activate the PAM module we just installed.
To do this, find the following line in /etc/pam.d/common-auth:

auth    [success=1 default=ignore]      pam_unix.so nullok_secure

and add the following line immediately above it:

auth required pam_google_authenticator.so

This activates the PAM module. Now it is time to configure the module for each user. The following steps must be repeated for every user on the system.
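After the edit, the relevant part of /etc/pam.d/common-auth should look like this:

auth required pam_google_authenticator.so
auth    [success=1 default=ignore]      pam_unix.so nullok_secure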

3-) While logged in as the user in question, run the following command in a terminal, with the Google Authenticator application open on your phone.


 ~$ google-authenticator


 

After answering the first question with y, scan the barcode that appears on the screen with the Google Authenticator application via Set up account --> Scan a barcode in the top right corner. After this step the account is added to your phone and it will keep generating fresh OTP security codes. Do not forget to save the five emergency codes printed under "Your emergency codes are" somewhere safe; they will let you log in to your account if your phone is stolen or lost.
Answer the remaining questions with y to finish.
From now on, after you log out, your computer will first ask for a verification code and then for your password. Enter the code shown on your phone as the verification code, press Enter, then enter your password, press Enter, and log in.


Wednesday, June 22, 2016

ScienceDirect Live Import to DSpace

Entering all the metadata by hand is really tedious work, so ScienceDirect and Atmire have developed a plugin for DSpace which ingests ScienceDirect articles into DSpace easily.
First apply the patch sent by ScienceDirect integration support. Then get an API key from http://dev.elsevier.com/myapikey.html and write it into the configuration file ${dspace.dir}/config/modules/elsevier-sciencedirect.cfg like this:

elsevier-sciencedirect.api.key = a2e3feffgd3542345gsdf

Then rebuild with mvn and do a fresh install of your DSpace. If you successfully build and deploy DSpace you will have an Elsevier import menu both in the submission steps and in the global import menu. You can watch it below.
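For reference, a typical DSpace 5.x rebuild and redeploy looks roughly like this; [dspace-source] is a placeholder for your source tree:

cd [dspace-source]
mvn package
cd dspace/target/dspace-installer
ant update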


 
This feature will be readily available in the upcoming DSpace 6.0.


Tuesday, May 10, 2016

Creating proxy pac files automatically from a Squid conf file

Pac files are automatic proxy configuration files, written in JavaScript, that browsers can interpret to route clients to a proxy according to the domain name they visit. A pac file is usually published through a web server, or it can be downloaded manually and the browser configured to use the local copy. Squid is an open source proxy server which libraries can configure for off-campus access to their academic database subscriptions. When someone adds, deletes or updates an entry in the squid conf file, the pac file should be updated accordingly; for large numbers of URLs this is a tedious task. The Python script below automates it and is meant to be run from a per-minute cron entry, such as the one sketched next.
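A minimal crontab entry; the script path is a placeholder for wherever you save the script:

* * * * * /usr/bin/python /usr/local/bin/update_pac.py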

On every execution the script writes a message to the system syslog. If the squid file has changed, which is detected by comparing the hashes of the old and new states of the file, the pac file is regenerated and the squid process is reloaded.



import hashlib
import os
import syslog
import datetime
import shutil
import subprocess

# Paths: the squid acl file that holds the database domains, and the
# web root where the generated pac file is published
squid_config_file_path = "/etc/squid/kutuphane_veritabanlari.squid"
pac_file_root = "/var/www/html"
hash_file = "%s/hash_file" % pac_file_root
old_pac_file = "%s/proxy.pac" % pac_file_root
date_ = datetime.datetime.now()
backup_file = "%s/pac_backup_file_%s.pac" % (pac_file_root, date_)
new_pac_file_path = "%s/new_pac_file.pac" % pac_file_root

# Hash the current contents of the squid config file
squid_config_file = open(squid_config_file_path)
readFile = squid_config_file.read()
squid_config_file.close()
sha1Hashed = hashlib.sha1(readFile.encode("utf-8")).hexdigest()

if os.path.exists(hash_file):
    hash_file_handle = open(hash_file, 'r')
    readHash = hash_file_handle.readline()
    hash_file_handle.close()
    if not readHash == sha1Hashed:
        syslog.syslog(syslog.LOG_WARNING,
                      "Hashes are different: kutuphane_veritabanlari.squid config file has been modified. Old hash: %s new hash: %s" % (readHash, sha1Hashed))
        # Back up the published pac file before regenerating it
        shutil.copyfile(old_pac_file, backup_file)
        new_pac_file = open(new_pac_file_path, "w+")
        new_pac_file.write("function FindProxyForURL(url, host) {\n")
        new_pac_file.write("var proxyserver = 'proxy2.iyte.edu.tr:8080';\n")
        new_pac_file.write("var proxylist = new Array(\n")
        # Collect every domain entry: lines starting with a letter or a dot
        with open(squid_config_file_path) as openfileobject:
            for line in openfileobject:
                if line[0].isalpha() or line.startswith("."):
                    new_pac_file.write('"%s",\n' % line.rstrip())
        new_pac_file.write('"dummy.com"\n);\n')
        new_pac_file.write("for(var i=0; i<proxylist.length; i++) {\n")
        new_pac_file.write("\tvar value = proxylist[i];\n")
        new_pac_file.write("\t\tif (dnsDomainIs(host, value) ) {\n")
        new_pac_file.write("\t\treturn 'PROXY '+proxyserver;\n")
        new_pac_file.write("\t}\n")
        new_pac_file.write("}\n")
        new_pac_file.write("return 'DIRECT';\n")
        new_pac_file.write("}")
        new_pac_file.close()
        # Record the new hash, publish the new pac file and reload squid
        os.remove(hash_file)
        hash_file_handle = open(hash_file, 'w')
        hash_file_handle.write(sha1Hashed)
        hash_file_handle.close()
        shutil.copyfile(new_pac_file_path, old_pac_file)
        subprocess.call("%s %s %s" % ('service', 'squid', 'reload'), shell=True)
    else:
        syslog.syslog(syslog.LOG_WARNING,
                      "Hashes are same: kutuphane_veritabanlari.squid config file hasn't been changed.")
else:
    # First run: just record the hash so later runs can detect changes
    hash_file_create = open(hash_file, 'w')
    hash_file_create.write(sha1Hashed)
    hash_file_create.close()
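For a squid file containing, say, a hypothetical .sciencedirect.com entry, the script produces a proxy.pac roughly like this:

function FindProxyForURL(url, host) {
var proxyserver = 'proxy2.iyte.edu.tr:8080';
var proxylist = new Array(
".sciencedirect.com",
"dummy.com"
);
for(var i=0; i<proxylist.length; i++) {
	var value = proxylist[i];
		if (dnsDomainIs(host, value) ) {
		return 'PROXY '+proxyserver;
	}
}
return 'DIRECT';
}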

