Science topic
Computer Programming - Science topic
Explore the latest questions and answers in Computer Programming, and find Computer Programming experts.
Questions related to Computer Programming
I fit my data in MATLAB with a function and I can see the results and coefficients in the Command Window. I need these coefficients in order to continue with my code, but
I can't pass the coefficients into my scripts or use the result programmatically.
This is the procedure I have tried so far, and I could not get it to work.
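In MATLAB, the fitted coefficients can usually be captured programmatically rather than read off the Command Window (for example, `coeffvalues(fitresult)` from the Curve Fitting Toolbox returns them as an array). The same idea, sketched in Python with NumPy on invented, noise-free data (the straight-line model and values here are placeholders, not the asker's actual fit):

```python
import numpy as np

# Hypothetical straight-line fit; the real model and data will differ.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 3.0 * x + 1.0  # synthetic data generated from known coefficients

# polyfit returns the coefficients as a plain array, so they can be
# stored in variables and reused directly in later computations.
coeffs = np.polyfit(x, y, 1)
slope, intercept = coeffs
print(slope, intercept)
```

The key point is that the coefficients end up in ordinary variables (`slope`, `intercept`) instead of only being printed, so downstream code can use them.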
As per my understanding, here are some definitions:
- lexical frequencies, that is, the frequencies with which correspondences occur in a dictionary or, as here, in a word list;
- lexical frequency is the frequency with which the correspondence occurs when you count all and only the correspondences in a dictionary.
- text frequencies, that is, the frequencies with which correspondences occur in a large corpus.
- text frequency is the frequency with which a correspondence occurs when you count all the correspondences in a large set of pieces of continuous prose ...;
You will see that lexical frequency produces much lower counts than text frequency, because in lexical frequency each correspondence is counted only once per word in which it occurs, whereas text frequency counts each correspondence multiple times, depending on how often the words in which it appears occur.
When referring to the frequency of occurrence, two different frequencies are used: type and token. Type frequency counts a word once.
So I understand that lexical frequencies probably deal with types, counting each word once, while text frequencies deal with tokens, counting words multiple times in a corpus; therefore, for the latter, we need to take into account the frequency of the words in which those phonemes and graphemes occur.
So far I have handled phoneme frequencies as follows.
Phoneme frequencies:
Lexical frequency: (single count of a phoneme per word / total number of phonemes counted in the word list) × 100 = lexical frequency % of a specific phoneme in the word list.
Text frequency is similar, but I fail when trying to add in the frequencies of the words in the word list: (all counts of a phoneme per word / total number of phonemes counted in the word list) × 100 vs. (sum of the word frequencies of the targeted words that contain the phoneme / total sum of the frequencies of all the words in the list) = text frequency % of a specific phoneme in the word list.
Please help me find a formula for calculating the lexical frequency and the text frequency of phonemes and graphemes.
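Under the definitions above, the two counts can be sketched in code. This is a hedged Python sketch with an invented toy word list; the "phoneme" symbols and corpus frequencies are placeholders, and real data would use proper phonemic transcriptions:

```python
from collections import Counter

# Toy data: each "word" is a tuple of phoneme symbols, mapped to an
# invented corpus frequency for that word.
words = {
    ("k", "a", "t"): 50,   # word -> corpus frequency
    ("t", "a", "k"): 10,
    ("a", "t"): 40,
}

# Lexical frequency: each phoneme counted once per word, unweighted.
lex = Counter()
for w in words:
    lex.update(set(w))          # set() collapses repeats within a word
lex_total = sum(lex.values())
lex_pct = {p: 100 * c / lex_total for p, c in lex.items()}

# Text frequency: every occurrence counted, weighted by word frequency.
txt = Counter()
for w, f in words.items():
    for p in w:
        txt[p] += f
txt_total = sum(txt.values())
txt_pct = {p: 100 * c / txt_total for p, c in txt.items()}

print(lex_pct)
print(txt_pct)
```

The same split carries over to graphemes by replacing phoneme tuples with grapheme strings; the only difference between the two measures is whether each word contributes once or in proportion to its corpus frequency.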
Hi,
I need to create a small cluster (2 nodes) using MS Windows and then create an application that would run on this cluster. I am a beginner with MPI, so I need some help with it.
I am using Intel MPI 5.1. I have made some test programs on one node and they run quite well, but now I want to run them on more than one node and I am not sure how that would work. I would connect the nodes using an Ethernet switch. As far as I know, MPI uses TCP as the underlying transport-layer protocol; is it possible to use UDP instead? If yes, how?
I have a problem: the "Draw Head-Model" button fails to draw a head model (error message below) when I launch COMETS2, press the "preprocessing" button, and read "testmodel.node".
----------
Error using fread
Invalid file identifier. Use fopen to generate a valid file identifier.
Error in CONE_tDCS
Error in gui_mainfcn (line 95)
feval(varargin{:});
Error in CONE_tDCS
Error in matlab.graphics.internal.figfile.FigFile/read>@(hObject,eventdata)CONE_tDCS('pushbutton1_Callback',hObject,eventdata,guidata(hObject))
Error while evaluating UIControl Callback.
----------
[My environment]
OS: win7 professional sp1 64bit
RAM: 8GB
MATLAB ver: R2017a (I also tested R2018a and R2015b, but the problem was the same.)
Could you tell me how I can run COMETS2?
I have learnt that to implement different task-scheduling policies (like RR, FCFS, ACO, SJF), I need to change the submitCloudlets() method in the DatacenterBroker class. However, I am having trouble coding the round-robin task-scheduling algorithm: do we have to include a time quantum, or do we just assign cloudlets to VMs in a round-robin way?
Could you please send me the code or help me out with it?
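In the simplest reading (no time quantum; each cloudlet runs to completion on its VM), round-robin assignment is just a modulo over the VM list. A hedged, language-neutral sketch in Python; the names here are invented for illustration, not the CloudSim API:

```python
def round_robin_assign(cloudlets, vms):
    """Assign each cloudlet to a VM in strict rotation (no time quantum)."""
    assignment = {}
    for i, cl in enumerate(cloudlets):
        assignment[cl] = vms[i % len(vms)]  # deal cloudlets out like cards
    return assignment

# Example: 5 cloudlets dealt out over 2 VMs.
print(round_robin_assign(["c0", "c1", "c2", "c3", "c4"], ["vm0", "vm1"]))
```

A time quantum only matters if the scheduler preempts running tasks; for plain cloudlet-to-VM binding, which is typically how round robin is demonstrated in CloudSim, the rotation above is enough.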
I was trying to install netCDF-Fortran on Ubuntu 16.04.2, but some problems occurred. The attached file shows the problem, which I also include here:
make[2]: Entering directory '/home/irrig/Reg/netcdf-fortran-4.4.3/nf_test'
make[3]: Entering directory '/home/irrig/Reg/netcdf-fortran-4.4.3/nf_test'
PASS: nf_test
PASS: create_fills.sh
PASS: tst_f77_v2
FAIL: ftst_vars
FAIL: ftst_vars2
FAIL: ftst_vars3
FAIL: ftst_vars4
FAIL: ftst_vars5
FAIL: ftst_vars6
FAIL: ftst_types
FAIL: ftst_types2
FAIL: ftst_types3
FAIL: ftst_groups
PASS: ftst_path
FAIL: ftst_rengrps
FAIL: f90tst_vars
FAIL: tst_types
FAIL: tst_types2
PASS: f90tst_vars_vlen
PASS: tst_f90_nc4
FAIL: f90tst_grps
ERROR: f90tst_fill
ERROR: f90tst_fill2
FAIL: f90tst_vars3
FAIL: f90tst_vars4
FAIL: f90tst_vars2
PASS: f90tst_path
FAIL: f90tst_rengrps
PASS: ftst_v2
PASS: ftest
PASS: tst_f90
make[4]: Entering directory '/home/irrig/Reg/netcdf-fortran-4.4.3/nf_test'
make[4]: Nothing to be done for 'all'.
make[4]: Leaving directory '/home/irrig/Reg/netcdf-fortran-4.4.3/nf_test'
============================================================================
Testsuite summary for netCDF-Fortran 4.4.3
============================================================================
# TOTAL: 31
# PASS: 10
# SKIP: 0
# XFAIL: 0
# FAIL: 19
# XPASS: 0
# ERROR: 2
============================================================================
See nf_test/test-suite.log
Please report to support-netcdf@unidata.ucar.edu
============================================================================
Makefile:1397: recipe for target 'test-suite.log' failed
make[3]: *** [test-suite.log] Error 1
make[3]: Leaving directory '/home/irrig/Reg/netcdf-fortran-4.4.3/nf_test'
Makefile:1503: recipe for target 'check-TESTS' failed
make[2]: *** [check-TESTS] Error 2
make[2]: Leaving directory '/home/irrig/Reg/netcdf-fortran-4.4.3/nf_test'
Makefile:1814: recipe for target 'check-am' failed
make[1]: *** [check-am] Error 2
make[1]: Leaving directory '/home/irrig/Reg/netcdf-fortran-4.4.3/nf_test'
Makefile:534: recipe for target 'check-recursive' failed
make: *** [check-recursive] Error 1
I have already installed gcc, g++ and gfortran, but the problem remained the same.
I want to know how I can solve this problem.
Dear All,
I am trying to simulate a moving heat flux. I made sure that I linked Abaqus 6.13 with Intel Fortran XE 2013 and Visual Studio 2012.
But when I try an example subroutine from the manual, it aborts with the error "Problem during compiling".
It can't be a syntax problem, because the subroutine is from the manual, for sure!
Also, when I compile the code in Visual Studio, I get: error #5102: Cannot open include file 'ABA_PARAM.INC'.
Thank you
- Suppose that I wrote MATLAB code to track objects.
- I hit Run and it took x milliseconds to execute on an IBM machine with certain specs, say an x-gigahertz processor, y gigabytes of RAM, etc.
- But how can I estimate how much time it would take to run on any other machine?
I have a Phantom Omni haptic device and want to render a deformable shape. To do this, I define a plane of vertices; when the cursor collides with the shape, vertices near the collision point move down, and the surface more or less deforms. The problem is that this method is too slow, so the force feedback and rendering frequency drop. What method would you suggest to solve this problem?
The ANN with a backpropagation algorithm is enough, this ANN will be used under the Fortran 95 and Python languages.
Best regards,
Ender Araujo.
Hi
I am trying to write Fortran code with Visual Studio and I get a stack-overflow error.
My code runs properly with another compiler, but in VS it just gives me this error.
Can someone help me?
I have to insert the initial speed along the x and y components of a block, which can fall.
This is for a 2D rock-fall computer program (RocFall) for a block in an area of fair seismicity. In a publication I found
x = 1.5 m/s
y = 0 m/s
Is this correct? Do you have other suggestions?
Thanks very much.
This relates to Linux code in which an application's access is changed from root to non-root. As root, the application could create a TCP server socket, but as soon as it ran as non-root, the socket was no longer created with the same code. Can someone suggest which SELinux policies should be added (and to which files) to solve this issue?
For example, I have 4 variables with numeric values (var1, var2, var3, var4). I would like to create a 5th variable that finds the maximum value among the 4 variables and displays the name of that variable as its result (if var1 has the maximum value across the original 4 variables, var5 would show the name var1). Or maybe it is possible by coding var1 as a numeric expression in var5? Any input is appreciated!
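If the stats package in use can call a small script, the logic is just an argmax over named values. A hedged Python sketch with invented numbers (the variable names and values below are placeholders):

```python
# Hypothetical values for the four variables.
values = {"var1": 3.2, "var2": 7.1, "var3": 7.0, "var4": 1.5}

# var5 holds the *name* of the variable with the maximum value:
# max() compares the dictionary keys by their looked-up values.
var5 = max(values, key=values.get)
print(var5)  # -> var2
```

The same idea in most stats packages is a nested IF/argmax expression; the essential trick is comparing values but returning the name.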
In my last question regarding the choice of architecture between MIPS and Intel x86 variants, I decided to go with MIPS in the new course syllabus and programme for teaching computer architecture, instruction sets and assembly language.
Now, I'm looking for a good MIPS emulator for practical lessons and use in laboratory classes to teach students architecture and assembly programming with use of such a tool, since, obviously, I cannot acquire enough physical units for students to work on.
Can someone suggest an adequate MIPS emulator/simulator with a straightforward interface that can be used for practical lessons in the laboratory classes for this course?
What PHP or JavaScript code can continuously run several links without redirecting?
I want to find a value in the range 0-x. Using binary iteration, this interval is halved at each step, but at each iteration I need to solve a complex, time-consuming problem. Are there algorithms or approaches faster than the binary iteration algorithm?
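If each probe gives a monotone yes/no answer (the answer flips exactly once over the interval), binary search is already optimal among comparison-based methods: about log2(range/tolerance) expensive evaluations. Speedups therefore come from extra structure, such as interpolation or secant steps when the underlying function is smooth, or a cheap surrogate model to pre-screen candidates. A minimal sketch that makes the call count explicit (the oracle here is an invented stand-in for the expensive computation):

```python
def bisect_threshold(lo, hi, oracle, tol=1e-6):
    """Find x in [lo, hi] where oracle flips from False to True.

    Each oracle call stands in for one expensive solve; the number of
    calls is about ceil(log2((hi - lo) / tol)).
    """
    calls = 0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        calls += 1
        if oracle(mid):
            hi = mid          # flip point is at or below mid
        else:
            lo = mid          # flip point is above mid
    return (lo + hi) / 2, calls

# Toy "expensive" test: is x at or above the unknown threshold 0.7325?
root, n_calls = bisect_threshold(0.0, 1.0, lambda x: x >= 0.7325)
print(root, n_calls)
```

Counting calls this way makes it easy to compare against any alternative strategy on the same problem.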
I had a byte array derived from an image; this array was converted to text and encrypted to be saved in a database. I want to retrieve the text, decrypt it, and convert it back to a byte array to recover the image. How can this be done?
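The usual pitfall is mixing up text and binary representations along the way. A hedged sketch of the round trip: Base64 stands in for whatever text encoding is used for database storage, and the `encrypt`/`decrypt` pair here is a trivial XOR placeholder, not real cryptography; in practice a vetted cipher (e.g. AES from a crypto library) would replace it:

```python
import base64

KEY = 0x5A  # placeholder key; XOR is for illustration only, NOT security

def encrypt(data: bytes) -> bytes:
    return bytes(b ^ KEY for b in data)

decrypt = encrypt  # XOR with the same key is its own inverse

image_bytes = bytes([137, 80, 78, 71, 0, 255, 3])  # stand-in for image data

# bytes -> encrypted bytes -> Base64 text (safe for a DB text column)
stored_text = base64.b64encode(encrypt(image_bytes)).decode("ascii")

# text -> bytes -> decrypted bytes (the original image data)
recovered = decrypt(base64.b64decode(stored_text))
print(recovered == image_bytes)
```

As long as every step has an exact inverse applied in reverse order (decode the text, then decrypt), the original byte array, and hence the image, comes back bit-for-bit.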
I'm doing dosimetry research on MIBG-I123, MIBG-I131 and 68Ga-DOTANOC using an internal-dose computer program called IDAC, but this software doesn't have the pre-adult phantoms, so I can't estimate the organ and effective doses for children (1-, 5-, 10- and 15-year-old patients) per ICRP Publication 103. After an extensive literature search, I noticed that there is a software package called OLINDA/EXM® 2.0 which has the pre-adult phantoms, but it is very expensive to buy for this purpose...
I was wondering: is there any free software similar to OLINDA/EXM® 2.0 that has pre-adult phantoms?
There are general programs that can be harnessed to calculate some of the criteria for evaluating projects economically. But are there any recent programs that have direct relevance to the economic feasibility of projects?
/home/anuradha/Desktop/boltztrap-1.2.5/src/BoltzTraP x_trans BoltzTraP
================ BoltzTraP vs 1.2.5 =============
At line 30 of file gtfnam.F90
Fortran runtime error: Bad integer for item 1 in list input
Please help me with this.
Dear colleagues,
I am investigating the population genetic structure of a forest-dwelling bird species using the STRUCTURE program on 9 STR loci.
As my dataset is unbalanced (11 sampling sites, ranging from 4 samples to 56 samples per site, on average 17 samples per site), I want to use the alternative ancestry prior described by Wang (2017).
My question: when entering the alternative ancestry prior "initial ALPHA = 0.33" (as the assumed number of populations is three) and checking "Separate Alpha for each population" in the menu "Ancestry model > Advanced > Configure", I am unsure how to deal with ALPHAPRIORA and ALPHAPRIORB. Do these stay at their defaults (0.05 and 0.001, respectively), or do I have to adapt them accordingly?
Other parameters: admixture model, correlated frequencies.
Thank you for your help!
Wang J (2017) The computer program STRUCTURE for assigning individuals to populations: easy to use but easier to misuse. Mol Ecol Resour 17(5):981-990
Dear,
I am looking for the MICROSAT software (Microsat, version 1.5: a computer program for calculating various statistics on microsatellite allele data) to use it in a script, but the links on the internet give an error. Would someone be able to send me this software (my email: adamrickbessa@gmail.com)? Thank you so much.
For example,
to minimize Z= [P1, P2, P3]
subject to some specified constraints
with non-negativity restrictions
where P1, P2, P3 are objective functions with different priority levels.
That is, which type of computer program can be used for this?
Thank you very much for your time and help.
Many computer science freshmen come to college with little or no background in computational thinking and almost no computer programming background. In my part of the world, only a very small number of freshmen have become adept at programming by graduation.
This is an artificial project created by the RG computer. I find this unacceptable; the RG software should be rectified to avoid such cases.
The samples were done in SOLiD.
The Staden Package does not come with a tutorial for the gap4 sequence-alignment program, which would have been very useful.
Step-by-step instructions would be nice, because I understand there are a lot of Linux or Unix commands involved and I am pretty green in that department.
I am using a Mac, just in case there are differences between platforms.
For me, I am very sure it is solved. If you are interested, first download the program and run it. Then read my paper and think about it; then you may also be sure.
How to use the program
1. I believe that most people who download my program are professionals, so please leave a message with your contact details, and your opinions are welcome if you download my program. You can leave your message here or at my email: edw95@yahoo.com. Thanks a lot.
2. This program is an informal one, and it is not the quickest one. But it includes my algorithm, it works correctly, and it works very well. It never fails.
3. How to use: if you have a 0-1 matrix representing a simple undirected graph with n vertices which has at least one Hamilton path from vertex 0 to vertex n-1, press the "ReadMatrix" menu item to read and calculate it, then press the "Write the result" menu item to write the result to a new file; you will get a Hamilton path from vertex 0 to vertex n-1 in the new file.
4. How to use: if you have an edge matrix representing a simple undirected graph with n vertices which has at least one Hamilton path from vertex 1 to vertex n, press the "ReadEdges" menu item to read and calculate it, then press the "Write the result" menu item to write the result to a new file; you will get a Hamilton path from vertex 1 to vertex n in the new file. If there is no such path, you get a message "no...". The input file format is one edge per row: "1,3" or "1 3", meaning an edge from vertex 1 to vertex 3.
5. The maximum degree is 3. Though I am very sure my algorithm can handle undirected graphs of any degree, this program cannot. The maximum vertex number is 3000, because PC memory is limited.
6. I would like to thank Professor Alexander Chernosvitov very much. He and one of his students took a long time to write a program (different from mine) to implement my algorithm, and he gave me and my work a good comment (see codeproject.com and researchgate.net). Mr. Xiaolong Wang did as well. Before them, nobody trusted me. Some not-smart-enough editors and reviewers rejected me on this logic alone: for such a hard problem, Lizhi Du is not a famous man, so he cannot solve it. Some editors or reviewers do not use their brains and say: your paper is apparently wrong, or, your paper cannot be understood. "Apparently wrong", funny! I have studied it for many years, yet it is "apparently wrong"! If a reviewer is really capable, uses his brain and spends his time, he surely can understand my paper. If you think I am wrong, tell me where it is wrong, and I will explain why it is not. If you think my paper cannot be understood, tell me what cannot be understood, and I will explain it. In my paper, in the Remarks, I explain how to understand my algorithm and proof. I think it is very clear.
7. I have studied this problem for many years and have put many versions of my paper on arXiv. Though the former versions had this or that problem, I am very sure the newest version of my paper is the final version and it is surely correct. It may contain some little bugs due to my English, but these do not affect the correctness, and I can explain or revise them easily.
8. Surely I think I have proved NP=P and have solved the problem NP vs. P.
9. Thank you for giving your attention and time to my algorithm!
Can anyone provide me with Arduino and OpenBCI code for connecting two external buttons to an 8-channel OpenBCI board (ADS1299), please?
I have installed Rosetta, a computer program for estimating soil hydraulic parameters with hierarchical pedotransfer functions, but if I look at the programs installed on the computer, I can't find it. I have tried to follow the installation guide from the University of Arizona website, but I still can't see it on my PC. Could someone help, please?
Thanks
How do I simulate the CSMA/CA algorithm in MATLAB? There are several nodes in the network, each with its own state. For example, some node may be sending data at the start of the simulation, other nodes may be waiting to send data, some may be idle, some may be waiting for an ACK, etc. While one node is sending data, other nodes may be changing state at the same time. Computer programs run sequentially: how is it possible to update the states of the other nodes, back-off counters, etc. at the same time as one node's data transmission is going on?
Of course, this is easy in NS-2, but I am not at all familiar with NS-2.
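A sequential program handles "simultaneous" state changes either by advancing a shared clock and updating every node once per tick (time-slotted), or by processing an event queue in timestamp order (discrete-event, which is what NS-2 does). A minimal time-slotted sketch with an invented, deliberately simplified node model (no ACKs or collision resolution); the key trick is snapshotting the channel state before any node updates, so all nodes "see" the same slot:

```python
import random

random.seed(1)

# Invented node model: a node is IDLE, counting down a backoff, or
# TRANSMITTING for a fixed number of slots.
nodes = [{"state": "BACKOFF", "backoff": random.randint(1, 5), "tx_left": 0}
         for _ in range(4)]

def channel_busy():
    return any(n["state"] == "TX" for n in nodes)

for slot in range(20):            # shared clock: one iteration = one slot
    busy = channel_busy()         # snapshot BEFORE anyone changes state
    for n in nodes:               # then update every node for this slot
        if n["state"] == "TX":
            n["tx_left"] -= 1
            if n["tx_left"] == 0:
                n["state"] = "IDLE"
        elif n["state"] == "BACKOFF" and not busy:
            n["backoff"] -= 1     # backoff freezes while channel is busy
            if n["backoff"] == 0:
                n["state"], n["tx_left"] = "TX", 3
        # IDLE nodes do nothing in this toy model

print([n["state"] for n in nodes])
```

The same structure carries over to MATLAB directly (a `for` loop over slots and a struct array of nodes); only the event-queue variant needs more machinery.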
Method is important when teaching programming languages. Alongside the development of computers, one of the aims has been to teach all children computer programming (Resnick et al., 2009). However, the difficulties students experience while writing programs in a program compiler, and the use of uninteresting activities in computer programming teaching (Resnick et al., 2009), caused students to consider computer programming a difficult task (Aşkar & Davenport, 2009; Caspersen & Kolling, 2009). The idea that computer programming is difficult for students and teachers (Armoni, 2011; Gökçearslan & Alper, 2015) has been addressed via practical visual-programming environments like Scratch, Alice and App Inventor.
Since programming is a difficult process to learn, it is important to choose appropriate educational methods.
I find the APDL commands of ANSYS quite difficult to remember. Some commands are abbreviations and acronyms that are easy to remember, but many more seem not to be deliberately designed and are quite difficult to remember and understand, especially their parameters.
Are there any tricks for remembering these commands?
I blindly tried ./profit-cli -f test.fits -p moffat:fwhm=1:con=1, but it returns an empty FITS image file.
Did anyone manage to compile Colin Zelt's FAST package in Linux Operating System (such as ubuntu or redhat) ?
As we know, FAST was written for Solaris.
I downloaded FAST and the patch for installing it on Linux, written by Andrew Brenders, from Colin Zelt's webpage, but it does not work on Linux.
Has anyone successfully compiled and run it on Linux or macOS?
If you have a working build, please email it to 1020116@hnust.edu.cn. Thanks.
Dear researchers,
I'm looking for open-source FEM software compatible with ABAQUS subroutines. I know there are a few open-source FEM packages available, but the question is: is any open-source FEM code compatible with ABAQUS subroutines, or can one be made compatible with minor changes to the code?
To conduct my simulation, I need to run a few routines, but they are ABAQUS subroutines written in FORTRAN, and I don't have access to an ABAQUS license for now. I know that ABAQUS has a student version, but that version has limitations, among which is a restriction on running subroutines.
Any comments and pieces of advice are highly appreciated.
Best,
Farhad
In other words: my project idea works in a MATLAB fuzzy-logic GUI under different conditions. How can I route this work to external hardware via an Arduino Mega 2560 microcontroller, since I am using a set of sensors to automate something?
I want to know how a sequence is tested in the Ubuntu terminal. I would like to see at least one clear, step-by-step test using the NIST test suite (sts-2.1.2), or any other suggestions for testing randomness using the runs test and the autocorrelation test.
I got code implementing task scheduling in CloudSim, in which the user makes the following call to a method on a cloudlet:
cloudlet.getCompletedRequests(getClassId.getSimplename())
There is no such method by default in the Cloudlet class, and I am unable to find it anywhere, not even in Cloudlet.java.
Where can I find the definition of this method?
I am simulating a metal cutting process in Abaqus / explicit and want to implement a damage model using the VUSDFLD and VUHARD subroutines. As it is the first time I am going to write a subroutine and I am not experienced with programming, I do not have idea on how to do it.
Could anyone share examples of these two subroutines (PDFs or a Fortran file) on which I can base my work? It would be a great help!
Thanks
I know MATLAB does not perform well in for-loop computation, and I know that if some variables grow at each iteration of the loop, more and more memory is occupied. But in my case, the purpose is batch processing of several data files. At the beginning of each iteration I clear all variables except the loop index (see below), yet the program gets slower and slower. Can anyone explain what is happening and help me fix it? Thanks!
The framework of the for loop is as follow:
clear;close all;clc;
% define data ID
WTNO = [15,3,1:2,4:14,16:24];
for i = 1:length(WTNO)
clearvars -except WTNO i
load([num2str(WTNO(i)) '.mat']);
% data processing
...
% data save
...
end
I developed code to generate and solve a system of linear equations using the Symbolic Math Toolbox. I want to create a standalone executable (.exe file), but MATLAB does not support building one from code that uses the Symbolic Math Toolbox: an executable is created, but it does not work properly.
Help me if you have any ideas.
I use the getarg subroutine in my FORTRAN program to read file names and related parameters from the command line. I would like to print the whole execution command in my output file, so that the output file records which input files and parameters were used to generate it. Is there a way to do that?
Dear Friends,
I am new to Android and am developing an auto-call-answer app. We have some basic code on GitHub which works up to KitKat; its URL is: https://github.com/steghio/auto-answer
Please advise how it can be made to work on Lollipop and above. We have been trying for over three weeks, but with no result.
Waiting for a fast, helpful response.
Thanks
Anes
Dear all.
Hi, I'm a student from Hanyang univ., writing to get some help.
In any field in engineering, mathematics, etc., it is necessary to generate data points in a domain; these can be called nodes, points or vertices, depending on the field.
This could be to generate a mesh for the finite element method, or to scatter points for interpolation, estimation or the like.
I don't know exactly what this topic is called or how to google it appropriately.
Is there anyone who knows well about this field?
Thank you
Minsik Seo
I need it for my course on databases. I tried to use EyeDB (http://www.eyedb.org/) but it has a lot of serious bugs.
For many years C++ has been the dominating language to start with. Recently Python is getting an edge over C++ and Java as the most popular programming language. Is it a good idea to switch to Python as a start instead of C++?
We have a wireless receiver system on an FPGA, and all of the data-processing units use fixed-point arithmetic.
We would like to find a freeware software tool that analyses not only the written code but also the comments in the code, so that it can replace a Detailed Design document (floating licenses).
I am trying to learn deep learning with an example. However, running the following code (attached as an image) results in an error:
Undefined function 'categories' for input arguments of type 'double'.
I understand that 'categories' in the original example works on string-based labels, but mine are int values. I am not sure why it isn't working. Kindly help.
One of the most challenging aspects of developing the previous version of ForTrilinos was devising a reference-counting scheme to ensure safe type finalization in Fortran and corresponding object destruction in C++. It's interesting to see that the SWIG-generated file forepetra.f90 on the new develop branch contains an Epetra_Object derived type with a type(c_ptr) component. Will that component become associated with the C++ object that the Fortran object shadows? Given that there's no final subroutine on the type, is there a strategy to avoid memory leaks and dangling pointers? How will the C++ code get notified when the Fortran object goes out of scope, so that C++ knows it's safe to destroy the object? How will the Fortran code get notified if C++ destroys the object?
What will happen with an expression such as
type(Epetra_Object) :: a,b,c
a = f(b) + g(c)
where f and g are functions that have Epetra_Object results containing c_ptr components, presumably associated with C++ objects that were instantiated at the direction of Fortran inside each function. After the above assignment completes, the Fortran code no longer has a name associated with the two function results, so the Fortran program has no way to direct C++ to destroy the corresponding C++ objects.
Damian
Hello, Dear colleagues
I have a little background in dynamic programming and I want to know whether dynamic programming is a suitable technique for solving layout problems,
and what the main applications of DP in manufacturing are.
All answers are appreciated.
Hi,
I am new to Cytoscape and have limited bioinformatics experience. I am having trouble importing an XML file generated from STRING into Cytoscape 3.4.0.
It shows:
Loading PSI-MI 2.5x XML File
You must have a non-null JAXB source.
However, I have created the network using the .tsv format, but the computer hangs while performing network analysis.
Please help.
Thanks,
Parmita
Please help.
What programming language should I use? I need code which produces these results.
Keywords: web service, response time, HMM model.
After replacing the PC, WinQ gives the following messages: "initializing", then "conveyor clearing". The "conveyor clearing" message does not change. Does anyone have any troubleshooting advice? We have carried out these steps: turned the Quant and the PC off and on; uninstalled and reinstalled the COM port; waited 50 hours and then turned on the Quant.
I've done a project on biometric identification with a PPG signal and I want to create a GUI for it. Do you have any MATLAB code or examples of GUIs for biometric identification?
I'd like to thank you very much in advance.
I want to generate a vector of random numbers between 0 and 1 in MATLAB, and I use the rand function. However, I need that every time I generate this vector, only one number has a value above 0.8, while the other numbers are random between 0 and 1. Any help?
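One hedged way to guarantee exactly one value above 0.8: draw the whole vector below the threshold, then overwrite one random position with a draw from [0.8, 1]. Sketched here in Python/NumPy rather than MATLAB (the function name and sizes are invented):

```python
import numpy as np

rng = np.random.default_rng(0)

def one_above(n, threshold=0.8):
    """Vector of n uniforms with exactly one entry above `threshold`."""
    v = rng.uniform(0.0, threshold, size=n)   # everyone below the threshold
    k = rng.integers(n)                       # pick one position at random...
    v[k] = rng.uniform(threshold, 1.0)        # ...and push it above
    return v

v = one_above(10)
print(v)
```

In MATLAB, the same trick is roughly `v = 0.8*rand(1,n); v(randi(n)) = 0.8 + 0.2*rand;`, where `randi(n)` picks the position to overwrite.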
I've been working on Python code for easy data processing and visualization of a simple numerical model written in Fortran and parallelized with MPI. I need to call some Fortran code from Python and have been unsuccessfully trying to wrap this Fortran/MPI code with f2py. Any suggestions?
I am getting crackling noise in my multimedia project. I have a hardware limitation that supports only 16-bit integer values, whereas the incoming data is float.
So I want to convert float to short with minimal precision loss. I have used a multiplication technique, but it loses a lot of precision. Can anybody help me with this?
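Audible crackling usually comes from overflow wrap-around at the cast rather than from precision as such. A hedged sketch of the standard recipe, assuming the incoming floats are nominally in [-1.0, 1.0] as in typical audio pipelines: scale by 32767, round to nearest instead of truncating, and clip to the int16 range before casting:

```python
import numpy as np

def float_to_int16(x):
    """Convert float samples in [-1, 1] to int16 with rounding + clipping."""
    x = np.asarray(x, dtype=np.float64)
    scaled = np.rint(x * 32767.0)                  # round to nearest, not truncate
    return np.clip(scaled, -32768, 32767).astype(np.int16)

samples = np.array([0.0, 0.5, -0.5, 1.0, -1.0, 1.2])  # 1.2 would wrap if cast naively
print(float_to_int16(samples))
```

At low signal levels, adding a tiny random dither before rounding trades the correlated quantization error for benign noise, which is the usual next step if distortion is still audible.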
Assume I need a random number in the range 0-100. What is the difference if I:
1. generate a single random number in the range 0-100.
2. generate a single random number in the range 0-10, and square it.
3. generate two independent random numbers namely, r1 and r2, in the range 0-10 and then r1*r2.
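All three land in [0, 100], but only the first is uniform: squaring or multiplying piles the probability mass toward small values. For instance, the mean drops from 50 to 100/3 ≈ 33.3 for the square (E[X²] for X uniform on [0, 10]) and to 25 for the product of two independent draws (E[X]·E[Y] = 5·5, scaled). A quick empirical check of the three options:

```python
import random

random.seed(42)
N = 200_000

# Option 1: one uniform draw on [0, 100]
uniform = [random.uniform(0, 100) for _ in range(N)]
# Option 2: square of one uniform draw on [0, 10]
squared = [random.uniform(0, 10) ** 2 for _ in range(N)]
# Option 3: product of two independent uniform draws on [0, 10]
product = [random.uniform(0, 10) * random.uniform(0, 10) for _ in range(N)]

for name, xs in [("uniform", uniform), ("squared", squared), ("product", product)]:
    print(name, sum(xs) / N)
```

So if a uniform distribution over 0-100 is the goal, only option 1 delivers it; options 2 and 3 are different (right-skewed) distributions on the same range.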
Hello dear all,
I want to synchronize all threads of a child kernel before executing further operations in the parent kernel in CUDA (dynamic parallelism). How can I do this? I have many threads in many blocks; I used a 1D grid of 1D blocks.
Thank you very much
I need to fix a problem on Nvivo. All my interviewees' audio files are grouped as sources and I can't classify them according to gender, age, etc. Any suggestion on how to solve it?
I have a dictionary whose values are matrices and whose keys are the most frequent words in the training file. For each line of a test file, I have to check whether its words are in the dictionary (the keys), get their values, add them together, and then divide by the number of words in the line that match the keys. The answer is one matrix. I tried "sum(val)", but it doesn't add the matrices together. How can I fix the end part of the code I've enclosed?
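Without seeing the enclosed code, a common cause is that `sum` over plain nested lists does not add element-wise; with NumPy arrays as the dictionary values, both the addition and the division are element-wise. A hedged sketch with invented stand-in data (the words, matrices and test line below are placeholders):

```python
import numpy as np

# Invented stand-in data: word -> 2x2 matrix.
word_vecs = {
    "cat": np.array([[1.0, 2.0], [3.0, 4.0]]),
    "dog": np.array([[5.0, 6.0], [7.0, 8.0]]),
}

def line_matrix(line, vecs):
    """Average the matrices of the in-vocabulary words on one line."""
    hits = [vecs[w] for w in line.split() if w in vecs]
    if not hits:
        return None                    # no known words on this line
    return sum(hits) / len(hits)       # element-wise: entries are arrays

print(line_matrix("the cat saw the dog", word_vecs))
```

Note the divisor is the number of *matched* words, as described in the question, not the total word count of the line.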
I am currently learning how to code efficiently. I wrote some codes using Java, Matlab, and SimTalk languages. I want to learn how to code efficiently and neatly, which may allow myself and other programmers to easily understand and modify my codes later. Could you please suggest some good websites which I can refer to? Thank you.
Dear C++ experts,
I'm a beginner with C++ and I've programmed my first numerical game for practice. A random number is chosen between 1 and 15, and the player has three chances to guess the right number. After each attempt, the program tells you whether the guessed number is greater or smaller than the goal.
Would you please evaluate my code and tell me about the efficiency of my program?
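For comparison, here is one common way such a game is structured (a hedged Python sketch, not the asker's C++ code): keep the game logic in functions that return feedback, so the logic can be tested without interactive input.

```python
LOW, HIGH, TRIES = 1, 15, 3

def feedback(secret, guess):
    """One round of the game: compare a guess against the secret."""
    if guess == secret:
        return "correct"
    return "too low" if guess < secret else "too high"

def play(secret, guesses):
    """Run up to TRIES guesses; return the list of feedback strings."""
    results = []
    for g in guesses[:TRIES]:
        results.append(feedback(secret, g))
        if results[-1] == "correct":
            break                      # stop early on a correct guess
    return results

# In the real game the secret would come from random.randint(LOW, HIGH)
# and the guesses from user input.
print(play(9, [5, 12, 9]))
```

Separating the comparison from the I/O loop is the main design point; efficiency is a non-issue at this size, so readability and testability matter more.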
I am trying to understand the color of amber glass containing reduced iron Fe(II) and sulfide sulfur in relation to crystal-field splitting of the d orbitals.
Data Partitioning in Sybase Adaptive Server® Enterprise 15 for Lower Costs, Higher Performance
Can anyone help me find some info about the WATHUN heat transfer computer program?
The complete error is:
"This application has requested the Runtime to terminate it in an unusual way.
Please contact the application's support team for more information."
The tool is Kiva, a free and open-source ground heat transfer calculation tool written in C++; it is a command-line tool.
I need this ENVI image file (.img) for further use and visualization in IDL.
The chip method doesn't work and doesn't provide the results needed.
Keywords: IDL, ENVI img, convert TIFF to img.
s=[0 1]; Let k{1,2}{1,1}='01010001', k{1,2}{1,2}='11010101', k{1,2}{1,3}='01010001', k{1,2}{1,4}='11010001', k{1,2}{1,5}='00010001' (continued up to k{1,2}{1,7})
if (strcmp(k{1,2}{1,1}, '00000001') || strcmp(k{1,2}{1,1}, '10000000')) && s(1)==0, p{i,j}='00000001';
elseif (strcmp(k{1,2}{1,1}, '00000001') || strcmp(k{1,2}{1,1}, '10000000')) && s(1)==1, p{i,j}='10000000';
elseif (strcmp(k{1,2}{1,2}, '0000001') || strcmp(k{1,2}{1,2}, '1000000')) && s(1)==0, p{i,j}='0000001';
elseif (strcmp(k{1,2}{1,2}, '0000001') || strcmp(k{1,2}{1,2}, '1000000')) && s(1)==1, p{i,j}='1000000';
elseif (strcmp(k{1,2}{1,3}, '000001') || strcmp(k{1,2}{1,3}, '100000')) && s(1)==0, p{i,j}='000001';
elseif (strcmp(k{1,2}{1,3}, '000001') || strcmp(k{1,2}{1,3}, '100000')) && s(1)==1, p{i,j}='100000';
elseif (strcmp(k{1,2}{1,4}, '00001') || strcmp(k{1,2}{1,4}, '10000')) && s(1)==0, p{i,j}='00001';
elseif (strcmp(k{1,2}{1,4}, '00001') || strcmp(k{1,2}{1,4}, '10000')) && s(1)==1, p{i,j}='10000';
elseif (strcmp(k{1,2}{1,5}, '0001') || strcmp(k{1,2}{1,5}, '1000')) && s(1)==0, p{i,j}='0001';
elseif (strcmp(k{1,2}{1,5}, '0001') || strcmp(k{1,2}{1,5}, '1000')) && s(1)==1, p{i,j}='1000';
It runs fine as expected, but here is what I want:
suppose the condition
>> elseif (strcmp(k{1,2}{1,5}, '0001') || strcmp(k{1,2}{1,5}, '1000')) && s(1)==1
is met. I want p{i,j} = '00011000', i.e., it should contain the new value '1000' appended after its previous original value, and similarly for the other cases. Can anybody suggest how to do this?
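The behaviour being asked for (keep the previous value and append the newly matched one, rather than overwrite) is plain string concatenation. A minimal Python stand-in for the MATLAB cell-array logic, with all names illustrative:

```python
# Sketch of the accumulate-on-match pattern asked about above.
patterns = ["0001", "1000"]

def accumulate(current, candidate, s_bit):
    """If `candidate` matches one of the patterns, append the value chosen
    by `s_bit` to the existing string instead of overwriting it."""
    if candidate in patterns:
        new = "1000" if s_bit == 1 else "0001"
        return current + new      # keep the previous value, then append
    return current

p = "0001"                        # previous original value
p = accumulate(p, "1000", 1)      # p becomes '0001' + '1000'
```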
I am interested in the history and theory of the Open Source movement, which in the last decade expanded into open data, open access, open science, open government, etc. Is the use of the word "open" in a direct line from Popper? How have these movements stayed true to or diverged from Popper's original conception in The Open Society and Its Enemies (1945)? I am most interested in the theoretical-historical relationship between these cultural movements, which are separated by about 30-40 years. Knowledge of Popper's work is important to that.
When researchers use software or a computer program, how should it be cited in a paper?
For example, how should MATLAB be cited in academic research?
Donald Knuth said:
Programmers waste enormous amounts of time thinking about, or worrying about, the speed of noncritical parts of their programs, and these attempts at efficiency actually have a strong negative impact when debugging and maintenance are considered. We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil. Yet we should not pass up our opportunities in that critical 3%.
When does that hold true? How do I know that I am not wasting my time?
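The usual practical answer to "how do I know?" is to measure before optimizing. A small sketch using Python's standard `timeit` module (the example workload is illustrative, not from the question): time both variants, and only spend optimization effort where the numbers say it matters.

```python
import timeit

# Two ways of building a 10,000-character string; measure rather than guess
# which one is the bottleneck.

def concat_naive(n):
    s = ""
    for _ in range(n):
        s += "x"
    return s

def concat_join(n):
    return "".join("x" for _ in range(n))

t_naive = timeit.timeit(lambda: concat_naive(10_000), number=50)
t_join = timeit.timeit(lambda: concat_join(10_000), number=50)
```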
Integer division is a common and useful operation in computer science. It comes up in many domains, such as the manipulation of matrices and grids.
Is there any formal symbol for this operation? Or at least a widely recognisable symbol that can be easily differentiated from the standard division (i.e. inverse of multiplication)?
Update: Please do not answer explaining how to use the integer division on this or that programming language. This question is strictly on the formalism to represent this operation.
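On the formalism itself: there is no single official standard, but two conventions are widely recognized in mathematics and algorithms texts. Floored integer division is written with the floor function, and number theory also uses an explicit binary operator div, related to mod as follows (for b ≠ 0, with floored division):

```latex
% Floored integer division, written with the floor function:
q = \left\lfloor \frac{a}{b} \right\rfloor
% or with the explicit binary operator used in number theory:
q = a \;\mathrm{div}\; b, \qquad a = b \cdot (a \;\mathrm{div}\; b) + (a \bmod b)
```

The floor-bracket form is easily distinguished from standard division and is the notation used, for example, throughout Knuth's work.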
I would appreciate it if anyone could point me to the latest literature on students' (especially first-years') learning difficulties in programming.
I know C++ and want to improve my level further. I am also thinking of learning another language. What about Python?
I am currently looking to improve a tracing method using Mango. However, I need a solid protocol. Does anyone have a good protocol for Mango? Thank you in advance!
If we have this nonlinear equation
F = exp[a*(N-Z+1)/A] * b * (1 + A^(1/3))^2, how can we determine the coefficients a and b using MATLAB or another program? The data for F, S, and A are in the attached file.
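One convenient route follows from the fact that the equation is linear in a and ln b after taking logarithms: ln F - 2 ln(1 + A^(1/3)) = ln b + a (N-Z+1)/A. A sketch in Python/NumPy (synthetic data stands in for the attached file; S is not used because it does not appear in the formula as written, and the function name is illustrative):

```python
import numpy as np

def fit_coefficients(F, N, Z, A):
    """Fit a and b in F = b * exp(a*(N-Z+1)/A) * (1 + A**(1/3))**2
    by log-linearizing: ln F - 2*ln(1 + A^(1/3)) = ln b + a*(N-Z+1)/A."""
    x = (N - Z + 1) / A
    y = np.log(F) - 2 * np.log(1 + A ** (1.0 / 3.0))
    # Solve y = a*x + ln(b) by ordinary least squares.
    M = np.column_stack([x, np.ones_like(x)])
    (a, ln_b), *_ = np.linalg.lstsq(M, y, rcond=None)
    return a, np.exp(ln_b)

# Synthetic check: generate data with known a, b and recover them.
rng = np.random.default_rng(0)
A = rng.uniform(50, 200, size=40)
Z = rng.uniform(20, 80, size=40)
N = A - Z
a_true, b_true = 0.8, 2.5
F = b_true * np.exp(a_true * (N - Z + 1) / A) * (1 + A ** (1.0 / 3.0)) ** 2
a_est, b_est = fit_coefficients(F, N, Z, A)
```

With noisy real data, the same log-linear fit still works as a starting point, or its result can seed a nonlinear fit on the original equation.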
Hello!
I would like to implement a VANET prototype on an Arduino or something similar, and I don't know how to start.
If someone could give me some advice on what I need to start implementing the protocols and that kind of thing, it would be very helpful.
Thanks in advance.
Redis and Memcached perform well with hash-based tables.
I am trying to explore tree-based main-memory databases.
Some researchers wrote in their papers that they replaced the hash table with a B+-tree in Redis.
But I can't find resources about B+-tree-based Redis on the internet.
I was wondering if somebody could share resources about B+-tree-based Redis or B+-tree-based Memcached.
Thanks.
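For intuition on why such a replacement is attempted at all: the main capability a B+-tree adds over a hash table is an ordered index, which makes range queries cheap. A tiny Python sketch of that ordered-index property (this is a sorted-array stand-in, not a B+-tree implementation, and the class name is illustrative):

```python
import bisect

class OrderedKV:
    """Tiny sorted-array key-value store. Not a B+-tree, but it shows the
    ordered-index capability (range scans) that motivates replacing a
    hash table with a B+-tree in systems like Redis."""
    def __init__(self):
        self._keys = []
        self._vals = {}

    def put(self, key, value):
        if key not in self._vals:
            bisect.insort(self._keys, key)   # keep keys sorted
        self._vals[key] = value

    def get(self, key):
        return self._vals.get(key)

    def range(self, lo, hi):
        """Yield (key, value) for lo <= key <= hi: cheap on an ordered
        index, but a full scan on a plain hash table."""
        i = bisect.bisect_left(self._keys, lo)
        j = bisect.bisect_right(self._keys, hi)
        for k in self._keys[i:j]:
            yield k, self._vals[k]

kv = OrderedKV()
for k in ["user:5", "user:1", "user:3", "sess:9"]:
    kv.put(k, len(k))
found = list(kv.range("user:1", "user:4"))
```

A real B+-tree additionally keeps its leaves in wide, linked nodes for cache- and disk-friendly traversal, which is what the papers exploit.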
As we all know, the VORTEX computer program simulates the effects of deterministic forces as well as demographic, environmental, and genetic stochastic events on wildlife populations, and it is widely used in population viability analysis of many wild animals. But can it also be used for population viability analysis of vulnerable human tribes? Have any such studies already been done? What changes would we have to make to the model for this?
Please, can anyone help with the following: I am planning to use a GA for function optimization in Java, and I believe I should use real encoding. I thought I could implement real encoding with my knowledge of binary encoding, but unfortunately I am having difficulty implementing real coding.
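The key shift from binary encoding is that a chromosome becomes a vector of floats, so crossover and mutation operate on real numbers directly (e.g. arithmetic blending and Gaussian perturbation) instead of bit flips. A minimal real-encoded GA sketch (in Python rather than Java, minimizing the sphere function as a stand-in objective; all parameter values and function names are illustrative assumptions):

```python
import random

def sphere(x):
    """Stand-in objective: f(x) = sum of squares, minimum at the origin."""
    return sum(v * v for v in x)

def real_ga(fitness, dim=3, pop_size=40, gens=200,
            lo=-5.0, hi=5.0, sigma=0.3, seed=42):
    rng = random.Random(seed)
    # Chromosome = real vector, initialized uniformly in [lo, hi].
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        nxt = []
        for _ in range(pop_size):
            # Tournament selection (size 3) of two parents.
            p1 = min(rng.sample(pop, 3), key=fitness)
            p2 = min(rng.sample(pop, 3), key=fitness)
            # Arithmetic crossover: a random blend of the parents.
            w = rng.random()
            child = [w * a + (1 - w) * b for a, b in zip(p1, p2)]
            # Gaussian mutation, clipped to the search bounds.
            child = [min(hi, max(lo, g + rng.gauss(0, sigma)))
                     for g in child]
            nxt.append(child)
        pop = nxt
    return min(pop, key=fitness)

best = real_ga(sphere)
```

The same three operators (tournament selection, arithmetic crossover, Gaussian mutation) translate directly to Java; adding elitism (carrying the best individual into the next generation unchanged) is a common refinement.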